ugjka@lemmy.world to Technology@lemmy.world · English · 7 months ago
Somebody managed to coax the Gab AI chatbot to reveal its prompt (infosec.exchange)
297 comments · cross-posted to: [email protected]
Corhen@lemmy.world · English · 7 months ago
Had the exact same thought. If you wanted it to be unbiased, you wouldn't tell it its position on a lot of items.

Seasoned_Greetings@lemm.ee · English · edited · 7 months ago
No, you see, that instruction "you are unbiased and impartial" is there to relay to the prompter if it ever becomes relevant. Basically it's instructing the AI to lie about its biases, not actually instructing it to be unbiased and impartial.

melpomenesclevage@lemm.ee · English · 7 months ago
No, but see, "unbiased" is an identity and social group, not a property of the thing.