
OpenAI’s Transcription Tool Hallucinates. Hospitals Are Using It Anyway


In health care settings, it’s important to be precise. That’s why the widespread use of OpenAI’s Whisper transcription tool among medical workers has experts alarmed.

On Saturday, an Associated Press investigation revealed that OpenAI's Whisper transcription tool fabricates text in medical and business settings despite warnings against such use. The AP interviewed more than a dozen software engineers, developers, and researchers who found the model regularly invents text that speakers never said, a phenomenon often called a “confabulation” or “hallucination” in the AI field. Nabla, a company whose Whisper-based transcription tool is used in medical settings, acknowledges that Whisper can confabulate, but it also reportedly erases the original audio recordings “for data safety reasons.” This creates an additional problem: doctors cannot verify a transcript’s accuracy against the source material.

