• Fedizen@lemmy.world
    1 day ago

    I’m seeing people use LLM’s for:

    • Dating
    • Email/work tasks
    • Customer support
    • Mental health hotlines

    Notably, in the dating, customer support, and mental health hotline cases, people are not always informed they’re talking to an LLM bot.

    I don’t think the “exposure to marijuana” analogy works here, because people are being exposed to it by businesses without consent.

    https://sfstandard.com/2025/08/26/ai-crisis-hotlines-suicide-prevention/

    • ExLisper@lemmy.curiana.net
      24 hours ago

      The issue we’re talking about is not getting a reply from a bot in a chat or phone call. We’re talking about people with mental health issues using AI in a way that exacerbates their problems. Specifically, we’re talking about people believing the AI is their personal companion and forming a personal connection with it, to the point that wrong answers generated by the AI affect their well-being. The vast majority of people don’t use AI like that.