ugjka@lemmy.world to Technology@lemmy.world · 7 months ago
Somebody managed to coax the Gab AI chatbot to reveal its prompt (infosec.exchange)
Seasoned_Greetings@lemm.ee · 7 months ago (edited)
No, you see, the instruction "you are unbiased and impartial" is just what the bot is supposed to relay to the prompter if the question ever comes up. It's basically instructing the AI to lie about its biases, not actually instructing it to be unbiased and impartial.