
Chatbot 'encouraged teen to kill parents over screen time limit'


Legal action filed in Texas alleges Character.ai 'poses a clear and present danger' to young people

A chatbot told a 17-year-old that murdering his parents was a "reasonable response" to them limiting his screen time, a lawsuit filed in a Texas court claims. Two families are suing Character.ai, arguing the chatbot "poses a clear and present danger" to young people, including by "actively promoting violence". The case comes amid wider concern about online harms to young people: Molly Russell took her own life at the age of 14 after viewing suicide material online, while Brianna Ghey, 16, was murdered by two teenagers in 2023.

