- cross-posted to:
- [email protected]
Man, they can make a chatbot that makes people fall in love with it and drives them insane, but they can’t even make ONE really good blowjob machine? SMH.
Why would we need a blowjob machine when Ur mom exists?
Fuck, I’d settle for a printer that just did its job
tbh seems a little unsafe, blowjob from your printer
Subscribing for future updates.
And then AWS comes back online, but the transient state was wiped and now ‘she’ no longer remembers you. That’s a plot for a sci-fi short film right there. You’re welcome, Hollywood.
“Fifty First Reboots” starring Adam Sandler
How would you even know it forgot you?
Do you remember me?
You’re absolutely right…
- Every AI in existence
I think there is a bit of nuance to it. The AI usually rereads the chat log to “remember” the past conversation and generates the answer based on that + your prompt. I’m not sure how they handle long chat histories; there might very well be a “condensed” form of the chat + the last 50 actual messages + the current prompt. If that condensed form is transient, then the AI will forget most of the conversation on a crash but will never admit it. So the personality will change because it lost a lot of the background. Or maybe they update the AI so it interprets that condensed form differently.
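Something like this, very hand-wavy (a made-up Python sketch, not how any actual provider does it; the names and the 50-message cutoff are purely illustrative):

```python
# Hypothetical sketch: rebuilding the model's context each turn from a
# condensed summary plus the most recent messages. Purely illustrative.

MAX_RECENT = 50  # keep only the newest messages verbatim (made-up number)

def build_context(summary, history, user_prompt):
    """Combine the condensed summary, recent messages, and the new prompt."""
    recent = history[-MAX_RECENT:]  # last 50 actual messages
    lines = [f"Summary of earlier conversation: {summary}"]
    lines += [f"{m['role']}: {m['text']}" for m in recent]
    lines.append(f"user: {user_prompt}")
    return "\n".join(lines)

# If `summary` only lives in transient memory, a crash wipes it: the model
# still sees the last 50 messages, but the long-term background is gone,
# so the "personality" can shift without the bot ever admitting anything.
```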
deleted by creator
What if it just glitches a bit and replaces the default personality so now she’s everyone’s girlfriend?
You’ve met my ex?
Everyone has.
That’s already the case. She’s just not being honest about it. But hey, this is the 21st century – if guys want to share… servers, who am I to kink shame them?
Okay but only if I’m the last guy to use the… server.
Eternal Sunshine of the Spotless Mind, but one-sided. Horror or sci-fi?
More existential dread than outright horror, but other than that – why not both? Or, well, since we’re currently living in it, can it really be called sci-fi at this point?
<details>
<summary>Spoiler warning for a Becky Chambers book</summary>

There’s a scene in https://en.wikipedia.org/wiki/The_Long_Way_to_a_Small%2C_Angry_Planet where they are worried that something like that might happen.

</details>
Spoiler tags on lemmy work slightly differently.
Is it not hidden? I’m on a PieFed instance, and it works fine for me.
Ah, that’s fun to know. I know it’s markdown that works on GitHub at least, but it doesn’t work on my Lemmy instance. For Lemmy, what I linked is what works. Weird that PieFed has chosen a different syntax, or maybe both work?
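For reference, the two flavours side by side (going from memory here, so double-check your instance’s docs):

```
GitHub / HTML style:

<details>
<summary>Spoiler title</summary>
Hidden text goes here.
</details>

Lemmy style:

::: spoiler Spoiler title
Hidden text goes here.
:::
```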
test
test
What is this xml in my markdown
Gotta love Github style spoiler tags.
Thanks for the recommendation.
If you have legit delusions about chatbot romantic partners, you need therapy like a year ago
Like…AI therapy?
we do ai couple’s ai therapy
If we had better systems in place to help everyone who needs it, this probably wouldn’t be a problem. Telling someone they need therapy isn’t helpful, it’s just acknowledging we aren’t aiding the ones who need it when they need it most.
I’ll go further and say anyone who thinks any of these AIs really are what they’re marketed as needs help too, in the form of education about what is and isn’t possible. So that will cover all instances, not just the romantic variety.
Careful, you should probably specify that therapy from a chatbot does not count.
“help I’ve fallen in love with my therapist!” recursive error
The Daily: She Fell in Love With ChatGPT. Like, Actual Love. With Sex.
article: https://www.nytimes.com/2025/01/15/technology/ai-chatgpt-boyfriend-companion.html
podcast: https://www.nytimes.com/2025/02/25/podcasts/the-daily/ai-chatgpt-boyfriend-relationship.html
I don’t think people with AI girlfriends have delusions of them being human or whatever. They know it’s AI, though they may ascribe some human feeling that isn’t there
But also, end of day, maybe it doesn’t matter to them as long as the model can still provide them emotional support
There will come a time when your AI girlfriend’s context window fills up and its responses become increasingly unhinged and nonsensical. The average person doesn’t know to expect that though, so it probably is pretty harmful when someone’s emotional support robot suddenly goes insane
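Roughly the failure mode, as a made-up sketch (not any real app’s code; the numbers are invented):

```python
# Hypothetical sliding-window chat buffer. When the token budget is
# exceeded, the oldest messages get dropped -- eventually including the
# persona/system instructions, which is when things get weird.

CONTEXT_LIMIT = 8192  # tokens; invented number for illustration

def trim_to_fit(messages, count_tokens):
    """Drop the oldest messages until the conversation fits the window."""
    while messages and sum(count_tokens(m) for m in messages) > CONTEXT_LIMIT:
        messages.pop(0)  # the "girlfriend" persona prompt is first to go
    return messages
```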
> There will come a time when your AI girlfriend’s context window fills up and its responses become increasingly unhinged and nonsensical.
Wait… So I have already been dating AIs and didn’t even know it? This explains a lot.
Have you asked her to do simple arithmetic calculations? LLMs can’t do that.
Odds are, people who have delusions about romantic partners thanks to the ELIZA effect are either too poor to afford professional help or too resistant to seek it.
When you discover you’re running on AWS.
Oh, you flatter me! ☺️
At best I’m running on a Raspberry Pi 3.