What to know about an AI transcription tool that ‘hallucinates’ medical interactions


Many medical centers use an AI-powered tool called Whisper to transcribe patients’ interactions with their doctors. But researchers have found that it sometimes invents text, a phenomenon known in the industry as ‘hallucination,’ raising the possibility of errors such as misdiagnosis. John Yang speaks with Associated Press global investigative reporter Garance Burke to learn more.

Jan 25, 2025 5:35 PM EST

Garance Burke, Associated Press: Yeah, so in talking with more than a dozen engineers and academic researchers, my co-reporter Hilke Schellmann and I found that this particular AI-powered transcription tool makes things up, and that can include racial commentary and sometimes even violent rhetoric. You know, we spoke to one person who said that she decided to opt out of having her daughter's doctor's visit recorded simply because she had concerns about the privacy of their family's intimate medical history being shared with a big tech company.
