• BootyEnthusiast@lemmy.dbzer0.com · 2 days ago

    TL;DR: Teen used a common jailbreak method to get it to discuss suicide the way he wanted it to. Company argues that it’s not responsible because intentional jailbreaking is against the TOS.

  • skuzz@discuss.tchncs.de · 2 days ago

I’d not say typing a sentence is a “jailbreak” of any sort. Rather, LLMs should just straight not allow certain topics until some future where how they respond is decided and regulated. Although at this point, I’d gladly lose my coding assistant in trade for making LLMs go away. Big tech is yet again being reckless with little to no accountability. They need their toys taken away.

    • mindbleach@sh.itjust.works · 11 hours ago

      > LLMs should just straight not allow certain topics

        They already don’t. It doesn’t work. That’s what happened here.