OpenAI’s Whisper transcription tool has hallucination issues, researchers say


Software engineers, developers, and academic researchers have serious concerns about transcriptions from OpenAI’s Whisper, according to a report from the Associated Press. A transcription would be expected to closely follow the source audio; instead, researchers told the AP that Whisper has introduced everything from racial commentary to imagined medical treatments into transcripts. An OpenAI spokesperson said the company is “continually working to improve the accuracy of our models, including reducing hallucinations” and noted that its usage policies prohibit using Whisper “in certain high-stakes decision-making contexts.”


Read more on: OpenAI, researchers, hallucination issues

Related news:


OpenAI denies it’s releasing a model called ‘Orion’ this year


Researchers say an AI-powered transcription tool used in hospitals invents things no one ever said


Researchers Develop New Lithium Extraction Method With 'Nearly Double the Performance'