Lawsuit blames Character.AI in death of 14-year-old boy


A new lawsuit blames Character.AI, the AI role-playing platform, for the suicide of a 14-year-old Florida boy.

According to The New York Times, Sewell Setzer III, a ninth grader from Orlando, had spent months talking to chatbots on Character.AI’s role-playing app. Setzer developed an emotional attachment to one bot in particular, “Dany,” which he texted constantly, to the point where he began to pull away from the real world. This morning, Character.AI said it would roll out a number of new safety features, including “improved detection, response, and intervention” related to chats that violate its terms of service and a notification when a user has spent an hour in a chat.

Related news:

Can a Chatbot Named Daenerys Targaryen Be Blamed for a Teen’s Suicide?

Lawsuit Argues Warrantless Use of Flock Surveillance Cameras Is Unconstitutional

Lawsuit: City cameras make it impossible to drive anywhere without being tracked | Police use of automated license-plate reader cameras is being challenged in a lawsuit alleging that the cameras enable warrantless surveillance in violation of the Fourth Amendment.