Character.AI sued again over ‘harmful’ messages sent to teens


An underage user says its chatbots encouraged self-harm.

The suit, filed in Texas on behalf of the 17-year-old and his family, targets Character.AI and its cofounders’ former workplace, Google, with claims including negligence and defective product design. It alleges that Character.AI allowed underage users to be “targeted with sexually explicit, violent, and otherwise harmful material, abused, groomed, and even encouraged to commit acts of violence on themselves and others.” And while Section 230 has long protected sites from being sued over third-party content, the Character.AI suits argue that chatbot service creators are liable for any harmful material the bots produce.



Related news:

Chatbot hinted a kid should kill his parents over screen time limits: lawsuit

Uber will need to fingerprint drivers in California to transport teens

Fallout season two starts filming with Walton Goggins back in character