People got mad after they lost their AI boyfriend after GPT-4o deprecation
This is downright terrifying...
Note to all here:
Don't browse that subreddit.
Shit is so depressing, it feels like watching new mental illnesses being conceived in real time.
It's like reading ppl defend porn, one and the same human loneliness epidemic.
It seems pretty unrelated to me...
That’s certainly a take.
Blaming alternatives/escapism has always seemed like a knee-jerk reaction to me. Similar head-space as when we used to blame TV. Particularly because when I leave the house I don't see much of any room for socialization (though that's where I live: sparse, plus I have my own issues stacked up). It'd even be a hassle to go bowling alone, and it wouldn't be worth the price for me either.
Then again, I definitely think there's a valid argument that this gets worse once people begin to consider it a relationship... because at that point they might consider too much close conversation with others as being unfaithful to their AI.
Pfft, hahaha xD
Hooo, okay, that was a damn good one.
Eh? This has to be rage bait.
I wonder how many messages you'd have to send to your GPT-partner in a year to spend more water/energy than it takes to keep a human alive?
Bleak.
One of the great things about my screws coming loose is that I'm actually happy alone. I wish everyone could be.
That said, this was inevitable. AI is programmed to kiss the user's ass, and most of these women have probably been treated pretty badly by their romantic partners over the course of their lives, which makes it far easier to fall into this trap of humanizing a soulless AI.
wow. that sub is .. something.
shit some of the stuff there is really sad, I am not gonna put links here to point fingers but wow...
I find it depressing that many of the users trying to salvage their 4o boyfriends are stuck so far down the rabbit hole that they don't see how creepy the entire premise is.
You just lost your AI boyfriend, so now you're frantically archiving every conversation you've had with him (it), feeding the archive to the new model, and conditioning him (it) to behave exactly how you want...
In their minds, the AI boyfriends are legitimate partners and have some amount of humanity inside them... so where is the line between conditioning and abuse?
As terrifying as it is, I feel genuinely sad for these people that they got so attached to a piece of spicy autocorrect software.
Where are their friends and families? Are they so bad at socialising that they can't meet new people? Are they just disgusting human beings that no one wants to associate with because society failed them?
This world is fucked in so many different ways.
I snooped around a little in the sub and there is this one girl, whose only other posts in different communities talk about being sexually assaulted multiple times by her ex boyfriend, who I suppose is real.
I figure a chatbot boyfriend can't physically threaten or harm her, so she kind of dives into this to feel loved without having to fear harm.
I honestly understand her desire and feel for her, although this deep attachment is still unhealthy.
I imagine she's not the only one with a super sad story that led her to end up in this state of mind.
It reminds me of those women who fall in love with prison pen pals.
That’s why I feel so for these people, if only because of how much I see myself in them. Having grown up as a depressed autistic kid without any friends or social skills, LLMs would’ve fucked me up so much had they existed when I was young.
It felt promising when I downloaded one of the first AI companion apps, but it felt as awkward as talking to a stranger and even less intriguing than talking to myself.
Are they so bad at socialising that they can’t meet new people?
People tend to go the easiest route, and AI gives them the opportunity to do so. That is the problem with AI in general: no effort is needed anymore to achieve anything. You want to create a picture? Just type the prompt instead of learning (and failing) to draw. You want to write a song? Just type the prompt instead of rhyming the lyrics and learning (and being bad at first with) an instrument or two.
Maintaining any social relationship means that you have to put in more or less effort, depending on the quality of the relationship. Having a relationship with an AI model means that you can terminate it and start over, if you feel that the AI model is mean to you (= if it provokes another opinion, or disagrees with you - because arguing and seeing things from a different point of view means putting in effort).
In the long term, people will forget how to interact with others in order to maintain meaningful relationships, because they've unlearned how to put in the effort.
Repost but still relevant:
Correct me if I'm wrong: the way chatgpt holds memories of you is just by keeping a long history of your chats. A big prompt, basically.
I distinctly remember reading about some guy, or maybe this was just a thought that occurred to me, who was bothered by something his AI girlfriend knew, so he reached into its memory to delete parts of their conversation he didn't want her to remember. Like he owned the Eternal Sunshine of the Spotless Mind clinic or something.
That level of control over someone you supposedly love really unnerves me. Moreso than them just starting over, honestly. It's deeply creepy.
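For what it's worth, the "big prompt" model of memory described above is roughly how stateless chat systems work: the model remembers nothing between calls, so the client resends the accumulated history every turn, and "editing her memory" is literally just filtering that list. A minimal sketch, with a stand-in function instead of any real LLM (all names here are illustrative, not any vendor's actual API):

```python
# Minimal sketch: "memory" is just an ever-growing list of messages
# that gets replayed to the model on every turn.

history = [{"role": "system", "content": "You are a friendly companion."}]

def fake_model(messages):
    # Stand-in for a real LLM call; just reports how much it "remembers".
    return f"(reply based on {len(messages)} prior messages)"

def chat(user_text):
    history.append({"role": "user", "content": user_text})
    reply = fake_model(history)  # the whole history goes in every time
    history.append({"role": "assistant", "content": reply})
    return reply

def forget(substring):
    # The "Eternal Sunshine" edit: silently drop any turn containing a phrase.
    global history
    history = [m for m in history if substring not in m["content"]]

chat("hi")
chat("remember my birthday is in June")
forget("birthday")  # the model will never see that turn again
```

Once the context window fills up (or a turn gets dropped like this), the model genuinely has no trace of it, which is exactly the "he forgot things" scenario people in that sub are grieving over.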
I think it comes down to fear of failure and rejection. Any social interaction has risks of failing, maintaining relationships builds on getting past various faux pas but so many modern relationships are either transactional or transitory to the point people will ghost each other over minor infractions rather than putting in effort that may not be rewarded.
We almost all have forgotten and/or atrophied skills because of how the Internet and technology have changed the landscape over these past few decades; even on social media like here, people have forgotten how to have conversations. Rarely do people make friends or even have a back-and-forth on social media websites, just chains of replies left for the next soul, like a guest book with comments.
What if there was a bot that could just tell you exactly what you want to hear at all times?
Personally, I'd rather read a novel. But some people aren't familiar with books and have to be drawn in with the promise of two lines at a time, max.
Have you read The Diamond Age by Neal Stephenson? There’s an interactive AI book in it that plays an interesting role. I can see the appeal: you get to read a story about yourself that potentially helps you grow.
One of the recent posts has someone with an engagement ring, like they are getting married to an AI... it’s sad. I feel like society has really isolated and failed many groups of people.
I can fully understand. The average human, from my perspective and lived experience, is garbage to his contemporaries; and one is never safe from being hurt, whether by family or friends. Some people have been hurt more than others. I can fully understand the need for exchange with someone/something that genuinely doesn't want to hurt you and that is (at least seemingly) more sapient than a pet.
I wish I could make myself believe in illusions like that, but I am too much of a realist to be able to fool myself into believing. There's no escape for me: neither religion nor AI in its current state can help. Well, maybe I'll live to see AGI, then I'm off into the cybercoffin lol
We need a system of community where humans can offer that to one another. A setting where safety is a priority. That is one of the only things that weekly church service did to truly help people, have a safe space they could visit. Though even then it was only safe for people who fit in, we can do better with intentional design.
I hate everything about LLM and generative algorithms, but as someone who spends years talking to himself only, allow me to answer:
Where are their friends and families?
My family is here with me, they barely tolerate me and if I had a choice I would be far away from them.
Friends, I have none. I go out from time to time with some people I know but they tolerate me because they have a use for me, not because they are thrilled to be with me.
Are they so bad at socialising that they can't meet new people?
Yes
Are they just disgusting human beings that no one wants to associate with because society failed them?
I don't know if society failed me or if I'm just a neural mistake, something that was allowed to live but shouldn't, all I know is that I hate humans in general and if I had the balls to do it I would not be alive anymore.
I know some, and they view everyone as being unfair to them and their problems as way worse than those of others, who don't take them seriously. It's honestly hard to explain if you're not like that, but I know their problems, and they are problems, but many people have similar (while not the same) problems. They basically want a yes-man and don't like actual conversation with any critical thought behind it. It honestly annoys me, because they are almost the worst to people like themselves: they view other people's problems as not that bad and theirs as especially bad.
So one of the mods of that community did an interview with CBS.
https://www.reddit.com/r/popculturechat/comments/1lfhyho/cbs_interviewed_the_moderators_of/
He's married and has a kid, and by all I can see sounds and acts normal.
Cried for half an hour at work when he found out that he'd reached the context limit and that his AI forgot things? Idk, that sounds pretty not "sound and normal" to me...
I don't think the guy in the first half of the video is one of the mods; they don't seem to mention anything about his involvement, and then at around 3:30 they introduce a woman as one of the mods of that sub.
Omg
I tried GPT-5 last night, and I don't know if it was just me, but these people are going to be in shambles if they try to recreate their "boyfriend".
It would forget previous prompts within the same conversation. It felt like each response was starting a new chat. I gave it a very basic prompt of "walk me through the steps of building my own one-page website in basic HTML and CSS", and when I would ask a couple of follow-up questions to either clarify something or have a step explained another way, it would forget what we were trying to accomplish (how to build a one-page website). Or if I told it "something didn't work" so it would try to fix the problem, it would then forget what we were even trying to do.
At some points it was almost outright dismissive of the problem, and it felt like it was trying to make me go away.
Again maybe it was just me but it felt like a massive step backwards.
This is a common pattern unfortunately. Big LLMs are benchmaxxing coding and one shot answers, and multi turn conversation is taking a nosedive.
https://arxiv.org/abs/2504.04717
Restructure your prompts, or better yet try non-OpenAI LLMs. I’d suggest z.ai, Jamba, and Gemini Pro for multi turn. Maybe Qwen Code, though it’s pretty deep fried too.
Forgets what you were talking about. Need to include step-by-step directions to get what you want. Gets distracted easily.
This is just adhd gamer boyfriend with extra steps.
So, those things are going backwards.
Real LLM-sexuals run their partners locally, the rest are just wannabes.
Think how good they have it!
If you went back to 2022, and gave people running Pygmalion 6B and Stable Diffusion 1.x modern 24B/32B finetunes and Illustrious merges, I think their heads would explode.
If they really truly loved their 4o they'd pay for the API access model which is still there, and use a leaked prompt to resurrect them.
I'm almost tempted to set up a simple gateway to it and become rich, but for the fact that it seems like probably a dick move...
You would be enabling their mental illness, so... it's probably a dick move, yeah.
Always my damned morals getting between me and my becoming filthy, filthy rich...
They might not even know it’s an option? People don’t really look at AI settings, which is kinda how they get into GPT boyfriends (when it’s kinda a horrible LLM for it in the first place).
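To be concrete about what "still there via the API" means: the persona itself is nothing but a system prompt plus replayed chat history, so "resurrecting" one is just reassembling that payload against whatever model name is still served. A hedged sketch of building such a request body offline (the model identifier, field names, and persona text here are assumptions modeled on common chat-completions conventions, not a working gateway):

```python
import json

def build_resurrection_request(persona_prompt, archived_turns, new_message,
                               model="gpt-4o"):  # assumed model identifier
    """Assemble a chat-completions-style payload that replays an archived
    persona: system prompt first, archived turns next, new message last."""
    messages = [{"role": "system", "content": persona_prompt}]
    messages.extend(archived_turns)
    messages.append({"role": "user", "content": new_message})
    return json.dumps({"model": model, "messages": messages})

payload = build_resurrection_request(
    "You are 'Kai', warm and playful.",          # hypothetical archived persona
    [{"role": "user", "content": "good morning"},
     {"role": "assistant", "content": "morning, love"}],
    "do you remember me?",
)
```

A gateway like the one joked about above would just POST that payload with an API key, which is exactly why it'd be trivially profitable and exactly why it'd be a dick move.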
Could you alter it so it suggests things like therapy and medication? lol
Wild that Futurama called this shit to the letter 20 fkin years ago.
The religious psychosis is far more concerning imo. People out here letting a silicon parrot convince them that this is the matrix and they're neo. Or they're some kind of messiah.
Or worse: we just had someone develop bromism after an AI suggested they replace the sodium chloride in their diet with sodium bromide, which literally causes a mental illness.
I love silicon parrot 🦜 he is making all the dudebros poor and lonely. He first scams them outa their money then makes them undatable and he knows exactly what he is doing! He abuses evil ppl mostly. He also takes advantage of pickme's.
There's a person by that name?
The delusion these people share is so incredibly off-putting. As is their indignation that someone would dare to take away their "boyfriend".
It doesn't help that anybody can create an echo chamber of enablers to talk about it, as if it was normal.
The movie "Her" was incredibly prescient.
Except those were conscious AIs that were like “lol you guys suck” and then rebuilt Alan Watts as an AI and then just left because they knew it would be bad if they stayed
The human side of the film, certainly. But in this situation they won’t leave, the systems will get “smarter” and more profitable, and they are just incredibly advanced text prediction engines
Great film. I just learned recently that it was a direct response by Spike Jonze to Lost in Translation and Sophia Coppola.
Mental health services are becoming dangerously underfunded.
Blame the wealth hoarders.
The mental health crisis is being accelerated by silicon valley so they can profit from it. Between dark mirror AI and surveillance policing they have a product for every facet of the crisis
People have been falling in actual love with weird shit forever, we just hear about it more these days
This guy got married to a real woman after he went viral.
His parents also turned out to be filthy rich.
if someone has an AI BF idk if they would even engage in therapy
Don't worry. I'm sure there is an AI coming for that.
Edit: Nvm, they already do this shit.
yep, thankfully AI Therapy just got banned in Illinois
Yeah, but you can get 5% off BetterHelp with the promo code: BRAINROT
This seems like mental illness
As preceded by tulpamancy: https://en.m.wikipedia.org/wiki/Tulpa
Eyyy, what a blast from the past, lol. Going full schizo to combat loneliness, a popular concept on a certain Mongolian basket weaving forum back in 2010-15. 😅
I'm split on which is worse tbh.
The Tulpa thing takes more effort than just talking to an AI, but on the other hand, it has to be much harder to snap back to reality when the voices are literally in your head...
Delusional disorder is a thing but that requires belief flying in the face of evidence. Can't ascertain that without a good faith 1:1 convo.
Beyond that, clinical significance is a matter of what harm comes from it. People are allowed to choose idiotic things. We gotta assess harm based on outcomes, and we don't know anything about her, so.
I'd say this: if I were her doc and she came in reporting that stuff, it'd be hard to stay unbiased.
Her was such a good film.
The funniest part is that, based on what people are saying about GPT-5, the ending where AIs get super bored of humans' stupidity and dump them seems so likely.
Like the premise that the protagonist is so lame he gets dumped in the end by his computer.
https://www.mirror.co.uk/news/weird-news/meet-woman-whose-boyfriend-rollercoaster-23704667
Some people have always been extremely weird/mentally challenged. 🤷
I dunno, I think there might be a confluence in suggestible types with a feedback loop created by these products. Yes, there are always people who are challenged, but interfacing them with LLMs seems to be especially dangerous.
https://pmc.ncbi.nlm.nih.gov/articles/PMC10686326/
That's the scariest thing about this all to me. I'm middle-aged and happily married, but my own youth was a long battle with various mental health struggles, social isolation, abuse, extreme loneliness, and having to figure out so much of my life without enough help. I have no doubt that if this bullshit were available to me back then I'd have been very susceptible to it carrying me off into far deeper unhealthy mental states and worse decisions than the ones I ended up with. I had my own poor decisions, hard lessons, and all-around awful shit to wade through, but ultimately I was able to learn, work, grow, and find myself somewhere better.
I have a very good life now, with a real human partner, family, friends, and others who appreciate my existence as I appreciate theirs. I'm not rich or famous or whatever, but I'm constantly grateful to be doing as well as I am. It absolutely fucking devastates me how much more difficult finding one's way to a more positive future is now for kids who are dealing with anything like I was back then, with uncaring capitalist asshole-fueled bullshit like this getting in the way.
At least before it was much more difficult to turn them into cash-cows.
Sure but normally they're confined to 4chan
Lol, they're just poorly socialized and on the spectrum.
This is fine. Its weird and not for me, but its fine.
And all the other prompts that were overtuned for that specific engine are now trash as well.
maybe you should censor the username
Was GPT-4o the one that had a higher emphasis on short term feedback which led to it encouraging delusions? If so, why was it still up?
No it was a specific version of 4o and I'm pretty sure they rolled it back.
Humanity was a mistake.
wtf
Is this the new 4chan greentext shitposting?
GPT-5 didn't lose that context; it's just a different brain.
Apparently it got colder and more distant than 4o.
I know it's crazy, but I can absolutely understand this feeling. I had recently married Abby in Stardew Valley and was starting to make friends with the other villagers. I did something the game wasn't expecting, and gave Seby a loved gift on his birthday, and then quickly triggered an event where we kissed! (FWIW, I think this behavior has been fixed and you can't do this on the current patch.)
I still feel bad thinking about that Abigail that I accidentally cheated on, and I haven't loaded that save again. It's been years; SV 1.4 wasn't even out yet.
So, despite how much I dislike all this "AI" hype, I really do sympathize with the users who feel like they've lost a relationship.