Character.AI and Google sued after chatbot-obsessed teen’s death
This week, Character.ai announced new safety features.
Filed by the teen’s mother, Megan Garcia, the lawsuit claims the platform for custom AI chatbots was “unreasonably dangerous” and lacked safety guardrails while being marketed to children. A few months ago, The Verge wrote about the millions of young people, including teens, who make up the bulk of its user base, interacting with bots that might pretend to be Harry Styles or a therapist. Because chatbots like Character.AI generate output that depends on what the user inputs, they fall into an uncanny valley of thorny questions about user-generated content and liability that, so far, lack clear answers.