ylai@lemmy.ml to AI@lemmy.ml · English · 2 years ago
AI chatbots tend to choose violence and nuclear strikes in wargames (www.newscientist.com)
21 comments · cross-posted to: [email protected], [email protected], [email protected]
FaceDeer@kbin.social · 7 points · 2 years ago
I wouldn’t be surprised if this actually factors into this outcome. AI is trying to do what humans expect it to do, and our fiction is full of AIs that turn violent.
averyminya@beehaw.org · 1 point · 2 years ago
Not to mention humans’ own tendencies toward violence.