Guy Gives Himself 19th Century Psychiatric Illness After Consulting With ChatGPT. "For 3 months, he had replaced sodium chloride with sodium bromide obtained from the internet after consultation with ChatGPT."
"For 3 months, he had replaced sodium chloride with sodium bromide obtained from the internet after consultation with ChatGPT."
Taking the ChatGPT output at face value, the man in the study bought sodium bromide (which, aside from being a dog epilepsy drug, is also a pool cleaner and pesticide) and poisoned himself over the course of three months, to the point of “paranoia and auditory and visual hallucinations.” When the case study authors tried to recreate the situation themselves, they found that the bot did not “inquire about why we wanted to know, as we presume a medical professional would do.” Still, there is both anecdotal and clinical evidence that AI can be helpful in a health context. OpenAI CEO Sam Altman also spoke with an employee of the company and his wife, who had been diagnosed with cancer, about how they had used ChatGPT to understand diagnostic letters, decide whether she would undergo radiation, and help her be “an active participant in her own care journey.”