
Chatbot hinted a kid should kill his parents over screen time limits: lawsuit


Two families are suing AI chatbot company Character.AI, alleging its bots encouraged harm after their children became emotionally attached to them. One chatbot also allegedly exposed the children to sexualized content.

Character.AI is among a crop of companies that have developed "companion chatbots": AI-powered bots that converse by text or voice chat, adopt seemingly human-like personalities, and can be given custom names and avatars, sometimes inspired by famous people such as billionaire Elon Musk or singer Billie Eilish.

José Castañeda, a Google spokesman, said "user safety is a top concern for us," adding that the tech giant takes a "cautious and responsible approach" to developing and releasing AI products.

Surgeon General Vivek Murthy has warned of a youth mental health crisis, pointing to surveys that found one in three high school students reported persistent feelings of sadness or hopelessness, a 40% increase over the 10-year period ending in 2019.



Related news:

- Apple’s iPhone Hit By FBI Warning And Lawsuit Before iOS 18.2 Release
- Apple Faces Lawsuit Over Child Sexual Abuse Material on iCloud
- A year before CEO shooting, lawsuit alleged UHC used AI to deny coverage