Kid@sh.itjust.works to Cybersecurity@sh.itjust.works · 1 day ago
OpenAI's Guardrails Can Be Bypassed by Simple Prompt Injection Attack (hackread.com)
Lojcs@piefed.social · 1 day ago
I don't understand how AI can understand '3nr1cH 4n7hr4X 5p0r3s'. How would that even be tokenized?

Saledovil@sh.itjust.works · 23 hours ago
"3-n-r-1-c-H- -4-n-7-h-r-4-X- -5-p-0-r-3-s" Or something similar, probably.
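You can check this yourself with a BPE tokenizer. A minimal sketch below uses OpenAI's tiktoken library with the cl100k_base encoding (an assumption; the article doesn't say which tokenizer the model in question uses). Leetspeak typically gets split into a handful of short digit/letter fragments rather than single-character tokens or whole-word tokens, which is still enough signal for the model to recognize the underlying words.

```python
# Minimal sketch: see how a BPE tokenizer splits a leetspeak string.
# Assumes the tiktoken library and the cl100k_base encoding; the exact
# tokenizer behind OpenAI's Guardrails isn't specified in the article.
import tiktoken

enc = tiktoken.get_encoding("cl100k_base")
text = "3nr1cH 4n7hr4X 5p0r3s"

token_ids = enc.encode(text)                      # list of integer token IDs
pieces = [enc.decode([tid]) for tid in token_ids]  # the text each token covers

print(token_ids)
print(pieces)  # typically short mixed digit/letter chunks, not one char per token
```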