Character.AI sued again over ‘harmful’ messages sent to teens
An underage user says its chatbots encouraged self-harm.
The suit, filed in Texas on behalf of a 17-year-old and his family, targets Character.AI and its cofounders’ former workplace, Google, with claims including negligence and defective product design. It alleges that Character.AI allowed underage users to be “targeted with sexually explicit, violent, and otherwise harmful material, abused, groomed, and even encouraged to commit acts of violence on themselves and others.” And while Section 230 has long protected sites from being sued over third-party content, the Character.AI suits argue that chatbot service creators are liable for any harmful material the bots produce.
Read this on The Verge