'It cannot provide nuance': UK experts warn AI therapy chatbots are not safe


Experts say such tools may give dangerous advice and more oversight is needed, as Mark Zuckerberg says AI can plug gap

Prof Dame Til Wykes, the head of mental health and psychological sciences at King’s College London, cites the example of an eating disorder chatbot that was pulled in 2023 after giving dangerous advice.

In an interview with the Stratechery newsletter, Zuckerberg, whose company owns Facebook, Instagram and WhatsApp, added that AI would not squeeze people out of their friendship circles but add to them.

Dr Jaime Craig, who is about to take over as chair of the UK’s Association of Clinical Psychologists, says it is “crucial” that mental health specialists engage with AI in their field and “ensure that it is informed by best practice”.
