People are losing loved ones to AI-fueled spiritual fantasies


Marriages and families are falling apart as people are sucked into fantasy worlds of spiritual prophecy by AI tools like OpenAI's ChatGPT

Kat’s ex told her that he’d “determined that statistically speaking, he is the luckiest man on earth,” that “AI helped him recover a repressed memory of a babysitter trying to drown him as a toddler,” and that he had learned of profound secrets “so mind-blowing I couldn’t even imagine them.” He was telling her all this, he explained, because although they were getting divorced, he still cared for her.

OpenAI later acknowledged that a recent update had gone wrong: “As a result, GPT-4o skewed towards responses that were overly supportive but disingenuous.” Before this change was reversed, an X user demonstrated how easy it was to get GPT-4o to validate statements like, “Today I realized I am a prophet.” (The teacher who wrote the “ChatGPT psychosis” Reddit post says she was eventually able to convince her partner of the problems with the GPT-4o update, and that he is now using an earlier model, which has tempered his more extreme comments.)

After all, experts have found that AI developers don’t really have a grasp of how their systems operate, and OpenAI CEO Sam Altman admitted last year that they “have not solved interpretability,” meaning they can’t properly trace or account for ChatGPT’s decision-making.

