Family of dead teen say ChatGPT's new parental controls not enough
The measure is one of a number of parental controls being introduced by OpenAI, which is being sued over the death of a teenager in the US.
When news of the lawsuit emerged last week, OpenAI published a note on its website stating that ChatGPT is trained to direct people in distress towards professional help, such as the Samaritans in the UK.

The company said it is working with a group of specialists in youth development, mental health and "human-computer interaction" to help shape an "evidence-based vision for how AI can support people's well-being and help them thrive".

Earlier this week, Meta - which owns Facebook and Instagram - said it would introduce more guardrails to its artificial intelligence (AI) chatbots, including blocking them from talking to teens about suicide, self-harm and eating disorders.