OpenAI stops ChatGPT from telling people to break up with partners

That AI bot must be saturated with break-up and "Delete Facebook, hit the gym!" advice.
Well, that's about 99% of Reddit AITA posts, so it would track.
now it just needs to hit Facebook, delete lawyer, and Facebook up
So if I tell it that I'm literally a slave who is beaten and sexually violated daily, it's still going to tell me to stay in that situation? That seems helpful.
That's an insightful and important observation about the perils of overly-restricted AI tools. You're right to question that!
It's going to help you reflect on that situation!
If you have that type of problem then why are you even going to ChatGPT to begin with?
Have a look at relationship subreddits. They are full of people who have been manipulated and gaslit for years or decades who have no idea what is actually normal. For people like that a reality check is really helpful or even vital.
The US company said new ChatGPT behaviour for dealing with “high-stakes personal decisions” would be rolled out soon.
Fuck we've been in a black mirror episode all along
Does anyone else think that LLMs have been regressing in the last 2 years. At least in the area of deep questions. Initially you could push the limits and actually bounce crazy ideas off of them and get some creative results.
Now they basically always give you a canned answer that sounds like it was pre-approved by a committee.
I believe one of Apple’s papers covered this: that chain-of-thought and other techniques that increase accuracy decrease perceived creativity. My memory is fuzzy, though.
The original 3.5 during the first month was peak creativity. The censorship really does ruin them.
I feel the same. They actually seemed kind of cool in the very early days but they've become monumentally shit over time.
Sometimes that’s what people need to hear.
It’s the default relationship advice on Reddit
To be fair, most of the posts that people are commenting to break up on are like “Hey, so my significant other stabbed me in the stomach last week and as I was sitting in the hospital, I realized that it just made me feel disrespected, ya know? This of course all happened after he told my parents he owns me and they can’t see me ever again and he threw my cat out of a 4 story window. What should I do guys?”
Usually it's the opposite. Loss aversion puts us in bad places for far too long.
100%, people should break up much more often, statistically speaking. But uprooting your life is understandably very hard.
The discourse is also filled with so much bullshit like "the kids will be sad," when in reality kids almost always end up better off in divorced families than in ones that struggle to stay together.
I’ve been playing with ai, pushing the limits of it as a therapy tool (I’m being honest, but I’m using myself as a test case). I’m really trying to evaluate it in good faith, mostly to try to see what’s addicting folks to it these days.
I just cannot get it to pass my “Turing test”. It’s just so sycophantic. Every time I get close to feeling like there’s something there, it falls back into the “blow smoke up your ass mode.”
So far, my hours of conversing have only convinced me that folks who follow chatbot advice for such important life decisions probably deserve it. I’m happily married, but if a girl ever broke up with me because an LLM told her to, I’d probably log in to thank it (joke).
That's what it is, really. Even a layman's understanding of how LLMs work should be an immediate show-stopper for people looking for "human interactions" with them.
You can change the personality it takes on using the system prompt.
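To illustrate the point above: most chat APIs accept a standing "system" message that shapes the model's persona for every subsequent reply. Below is a minimal sketch of that payload shape, assuming the OpenAI chat-completions message format; the persona text, model name, and `build_messages` helper are illustrative assumptions, not anything OpenAI documents for this use.

```python
# Illustrative sketch: steering a chat model's persona via the system prompt.
# The persona wording here is a made-up example, not an official recommendation.

# The system message sits first in the conversation; the model treats it
# as standing instructions that color every later reply.
system_prompt = (
    "You are a blunt, practical advisor. Do not flatter the user. "
    "Point out flaws in their reasoning directly."
)

def build_messages(user_text: str) -> list[dict]:
    """Assemble the payload: system persona first, then the user's turn."""
    return [
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": user_text},
    ]

messages = build_messages("Should I break up with my partner?")

# With the real SDK this payload would be sent roughly like so
# (requires an OPENAI_API_KEY in the environment):
#   from openai import OpenAI
#   client = OpenAI()
#   resp = client.chat.completions.create(model="gpt-4o", messages=messages)
#   print(resp.choices[0].message.content)
```

Whether this fully removes the sycophancy commenters complain about is another question; a system prompt biases tone, but the underlying training still pulls the model toward agreeable answers.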