Not an expert.
But I don’t think that’s how you do good PR :p
It’s somehow a fucking reality now…

😂✝️
Ah yes, should ban him for life for breaking the TOS! 💡
TL;DR: Teen used a common jailbreak method to get it to discuss suicide the way he wanted it to. Company argues that it’s not responsible because intentional jailbreaking is against the TOS.
I wouldn’t call typing a sentence a “jailbreak” of any sort. More to the point, LLMs should just flat-out refuse certain topics until some future where how they respond is decided and regulated. Although at this point, I’d gladly lose my coding assistant in trade for making LLMs go away. Big tech is once again being reckless, with yet again little to no accountability. They need their toys taken away.
LLMs should just straight not allow certain topics
They already don’t. It doesn’t work. That’s what happened here.
I mean, fair enough.
Hmmm… I don’t want to side with OAI… but…
Don’t do it then
… they seem to have a point though… But okay. I’ll shut t.f. up ✌️