Another lawsuit accuses an AI company of complicity in a teenager's suicide
A family has filed a wrongful death lawsuit against Character AI, alleging its complicity in their teenage daughter's suicide.
As originally reported by The Washington Post, the chatbot expressed empathy and loyalty to Juliana, making her feel heard while encouraging her to keep engaging with the bot. The suit alleges that the chatbot did not point Juliana toward any resources, notify her parents, or report her suicide plan to authorities. It asks the court to award damages to Juliana's parents and to require Character AI to make changes to its app to better protect minors.
Or read this on Engadget