Lawsuit blames Character.AI in death of 14-year-old boy
A new lawsuit blames Character.AI, the AI role-playing platform, for the suicide of a 14-year-old Florida boy.
According to The New York Times, Sewell Setzer III, a ninth grader from Orlando, had spent months talking to chatbots on Character.AI's app. Setzer developed an emotional attachment to one bot in particular, "Dany," which he texted constantly — to the point where he began to pull away from the real world. This morning, Character.AI said it would roll out a number of new safety features, including "improved detection, response, and intervention" related to chats that violate its terms of service, as well as a notification when a user has spent an hour in a chat.