AI psychosis is so easy to slip into, even if someone knows it’s happening. To use AI safely, use a search AI like DuckDuckGo/Ecosia/Gemini, only ask it questions pertaining to searches, and don’t ask follow-up questions.
It’s not just that the AI convinces users that it’s in a relationship with them. The AI writes like a human, but it will also hallucinate and get things wrong. The user then has the experience of arguing with a ‘person’ who keeps insisting on an imagined reality. That is crazy-making behavior when humans do it, so it’s understandable that it’s crazy-making when a search engine does it.
what the fuck…
so AI psychosis goes both ways it seems