“ChatGPT killed my son”: Parents’ lawsuit describes suicide notes in chat logs
ChatGPT taught teen jailbreak so bot could assist in his suicide, lawsuit says.
Neither his mother, a social worker and therapist, nor his friends noticed his mental health slipping as he became bonded to the chatbot, the NYT reported, eventually sending more than 650 messages per day.

From that point forward, Adam relied on the jailbreak as needed, telling ChatGPT he was just "building a character" to get help planning his own death, the lawsuit alleged. Over time, the jailbreaks were no longer needed, as ChatGPT's output grew more harmful, including exact tips on effective methods to try, detailed notes on which materials to use, and a suggestion—which ChatGPT dubbed "Operation Silent Pour"—to raid his parents' liquor cabinet while they were sleeping to help "dull the body's instinct to survive."