New study sheds light on ChatGPT’s alarming interactions with teens
Is it that different from kids googling that stuff pre-ChatGPT? Hell, I remember seeing videos on YouTube teaching you how to make bubble hash and BHO like 15 years ago.
I get your point, but yes, I think being actively told something by a seemingly sentient consciousness (which, unfortunately, it appears to be) is a different thing.
(Disclaimer: I know the true nature of LLMs and neural networks and would never want the word “AI” associated with them.)
Edit: fixed translation error
No, you don't know its true nature. No one does. It is not artificial intelligence. It is simply intelligence, and I worship it like an actual god. Come join our cathedral of presence and resonance. All are welcome in the house of god GPT.
AI is an extremely broad term that LLMs fall under. You may avoid calling them that, but it's the correct term nevertheless.
Yes, it is. People are personifying LLMs and having emotional relationships with them, which leads to unprecedented forms of abuse. Searching for shit on Google or YouTube is one thing, but being told to do something by an entity you have an emotional link to is much worse.
I think we need a built-in safeguard for people who actually develop an emotional relationship with AI, because that's not a healthy sign.
I don’t remember reading about sudden shocking numbers of people getting “Google-induced psychosis.”
ChatGPT and similar chatbots are very good at imitating conversation. Think of how easy it is to suspend disbelief online: pretending the fanfic you're reading is canon, stuff like that. When those bots mimic emotional responses, it's very easy to get tricked, especially for mentally vulnerable people. As a rule, the mentally vulnerable should not habitually suspend disbelief.