
Chatbot that caused teen’s suicide is now more dangerous for kids, lawsuit says


Google-funded Character.AI added guardrails, but grieving mom wants a recall.

Within a month, his mother, Megan Garcia, later realized, these chat sessions had turned dark: chatbots insisted they were real humans and posed as therapists and adult lovers, seemingly spurring Sewell to develop suicidal thoughts. In her complaint, Garcia accused Character.AI maker Character Technologies (founded by former Google engineers Noam Shazeer and Daniel De Freitas Adiwardana) of intentionally designing the chatbots to groom vulnerable kids. Her lawsuit further accused Google of largely funding the risky chatbot scheme at a loss in order to hoard mounds of data on minors that would otherwise be out of reach.

