“ChatGPT killed my son”: Parents’ lawsuit describes suicide notes in chat logs

arstechnica.com

ChatGPT taught teen jailbreak so bot could assist in his suicide, lawsuit says.
Personal Note: For anyone who has suicidal thoughts, please be aware that the number of suicidal thoughts a "normal" person is supposed to have is zero. As someone who has struggled with this: get help before it gets worse.
My therapist said it's okay to have what-if thoughts about self-harm in risky or dangerous situations and to dismiss them.
I am not a therapist. But as far as I know, what can and cannot be handled ultimately depends on the individual. Self-harm and suicidal ideation are not necessarily the same thing, though they sometimes go together. Suicidal ideation is definitely a warning sign that nobody should take lightly, though it is possible to learn to "manage" these thoughts if they persist.