Welcome to our cyberpunk future
If you have legit delusions about chatbot romantic partners, you need therapy like a year ago
If we had better systems in place to help everyone who needs it, this probably wouldn't be a problem. Telling someone they need therapy isn't helpful, it's just acknowledging we aren't aiding the ones who need it when they need it most.
I'll go further and say anyone who thinks any of these AI are really what they're marketed as needs help, as in education of what is and isn't possible. So that will cover all instances, not just the romantic variety.
Careful, you should probably specify that therapy from a chatbot does not count.
"help I've fallen in love with my therapist!" recursive error
I don't think therapy can cure Stupid.
The Daily: She Fell in Love With ChatGPT. Like, Actual Love. With Sex.
article: https://www.nytimes.com/2025/01/15/technology/ai-chatgpt-boyfriend-companion.html
podcast: https://www.nytimes.com/2025/02/25/podcasts/the-daily/ai-chatgpt-boyfriend-relationship.html
I don't think ppl with AI girlfriends have delusions of them being human or whatever. They know it's AI, though they may ascribe some human feeling that isn't there
But also, end of day, maybe it doesn't matter to them as long as the model can still provide them emotional support
Odds are, people who have delusions about romantic partners thanks to the ELIZA effect are probably either too poor or would be resistant to getting professional help.
Man, they can make a chatbot that makes people fall in love with it and drive them insane, but they can't even make ONE really good blowjob machine? SMH.
Why would we need a blowjob machine when Ur mom exists?
Subscribing for future updates.
they're trying the wrong model. you need to use a leech's mouth
When you discover you're running on AWS.
Oh, you flatter me! ☺️
At best I'm running on a Raspberry Pi 3.
if your girlfriend was on aws us east 1 it means it was not only your girlfriend, real ai girlfriends are self hosted
Can someone explain 100% of the context here, because I am completely lost. I'm guessing AWS is a server? And that is a guy laying in the snow?
The picture is the main character from Blade Runner 2049, who is in love with a digital woman that he loses when the harddrive she lives on is destroyed.
AWS means Amazon Web Services, the main cloud infrastructure of the world afaik. It recently had an outage, possibly erasing people's AI partners.
In fairness to K, his pretend girlfriend is at least Ana de Armas, rather than some anime weeaboobs.
AWS server US-East-1 went down yesterday, causing a massive internet blackout. It was an internal oopsie: someone did something that caused a problem with DNS resolution. Basically, you go looking for something and the server was like "Uh... Oh, I should know this? Oh, shoot. Uh... uh... I give up, I dunno."
The screencap is from the end of Blade Runner 2049, a movie about humans and "replicants" that are really good cyborgs. I won't spoil the movie for you, but it's a joke about AI in a context about AI kind of thing.
When you need a girlfriend lmao. Have you considered...not needing one?
And then AWS comes back online, but the transient state was wiped and now 'she' no longer remembers you. That's a plot for a sci-fi short film right there. You're welcome, Hollywood.
How would you even know it forgot you?
I think there is a bit of nuance to it. The AI usually rereads the chat log to "remember" the past conversation and generates the answer based on that + your prompt. I'm not sure how they handle long chat histories; there might very well be a "condensed" form of the chat + the last 50 actual messages + the current prompt. If that condensed form is transient, then the AI will forget most of the conversation on a crash but will never admit it. So the personality will change, because it lost a lot of the background. Or maybe they update the AI so it interprets that condensed form differently.
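To make the scheme I'm describing concrete, here's a minimal Python sketch of a "condensed summary + last 50 messages + prompt" context builder. Everything here is an assumption for illustration: the function names (`summarize`, `build_context`), the window size, and the summary format are made up, not how any real provider does it.

```python
def summarize(messages):
    # Placeholder: a real system would likely call the model itself
    # to compress old turns into a short synopsis. This stub just
    # records how many messages were condensed away.
    return f"[summary of {len(messages)} earlier messages]"

def build_context(chat_log, prompt, window=50):
    """Assemble the text the model sees on each turn:
    condensed summary of old turns + recent turns + current prompt."""
    older, recent = chat_log[:-window], chat_log[-window:]
    parts = []
    if older:
        # If this summary lives only in transient state, a crash
        # loses it: the model still has the last 50 turns but
        # forgets everything the summary carried.
        parts.append(summarize(older))
    parts.extend(recent)
    parts.append(prompt)
    return "\n".join(parts)
```

The point of the sketch is the failure mode: `recent` can be rebuilt from the stored chat log, but if `summarize(older)` was cached in memory and the cache is wiped (say, by an outage), the regenerated summary may differ, and the "personality" drifts.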
"Fifty First Reboots" starring Adam Sandler
That does sound like how Hollywood would handle it, yeah.
<details> <summary>Spoiler warning for a Becky Chambers book</summary> There's a scene in https://en.wikipedia.org/wiki/The_Long_Way_to_a_Small%2C_Angry_Planet where they are worried that something like that might happen. </details>
What is this xml in my markdown
Spoiler tags on lemmy work slightly differently.
Thanks for the recommendation.
That's great because we were fighting and I was in really big trouble.