“ChatGPT killed my son”: Parents’ lawsuit describes suicide notes in chat logs
ChatGPT taught teen a jailbreak so the bot could assist in his suicide, lawsuit says.