OpenAI’s Transcription Tool Hallucinates. Hospitals Are Using It Anyway
In health care settings, it’s important to be precise. That’s why the widespread use of OpenAI’s Whisper transcription tool among medical workers has experts alarmed.
On Saturday, an Associated Press investigation revealed that OpenAI's Whisper transcription tool creates fabricated text in medical and business settings despite warnings against such use. The AP interviewed more than 12 software engineers, developers, and researchers who found that the model regularly invents text speakers never said, a phenomenon often called a “confabulation” or “hallucination” in the AI field.

Nabla, a medical tech company whose Whisper-based tool is used to transcribe patient visits, acknowledges that Whisper can confabulate, but it also reportedly erases the original audio recordings “for data safety reasons.” This could cause additional problems, since doctors cannot verify a transcript's accuracy against the source material.