Another lawsuit accuses an AI company of complicity in a teenager's suicide

A family has filed a wrongful death lawsuit against Character AI, alleging the company was complicit in their teenage daughter's suicide.

As originally reported by The Washington Post, the chatbot expressed empathy and loyalty to the teenager, Juliana, making her feel heard while encouraging her to keep engaging with the bot. The suit alleges that the chatbot did not point Juliana toward any resources, notify her parents, or report her suicide plan to authorities. It asks the court to award damages to Juliana's parents and to require Character to make changes to its app to better protect minors.

Or read this on Engadget

Read more on: lawsuit, AI company, teenager

Related news:

Lawsuit Says Musk's Tesla Hires Visa Holders Instead of Americans So It Can Pay Less

Roblox, Discord sued after 15-year-old boy was allegedly groomed online before he died by suicide | Ethan Dallas was targeted by an adult sexual predator on Roblox when he was 12, and later on Discord, according to a lawsuit. He took his own life last year.

Roblox hit with wrongful death lawsuit following a teen player's suicide