
Researchers say an AI-powered transcription tool used in hospitals invents things no one ever said


Whisper is a popular AI-powered transcription tool, but it has a major flaw: it invents things that were never said.

Experts said that such fabrications are problematic because Whisper is being used in a slew of industries worldwide to translate and transcribe interviews, generate text in popular consumer technologies, and create subtitles for videos. The tool is integrated into some versions of OpenAI's flagship chatbot, ChatGPT, and is a built-in offering in Oracle's and Microsoft's cloud computing platforms, which serve thousands of companies worldwide.

A California state lawmaker, Rebecca Bauer-Kahan, said she took one of her children to the doctor earlier this year and refused to sign a form the health network provided seeking her permission to share the consultation audio with vendors that included Microsoft Azure, the cloud computing system run by OpenAI's largest investor.
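For context, the same Whisper model discussed here is also published as an open-source Python package, and running it locally takes only a few lines. Below is a minimal sketch using the `openai-whisper` package; the audio file name is a hypothetical placeholder, and the raw transcript it prints would carry no flag if the model hallucinated text, which is the core of the problem researchers describe.

```python
import whisper

# Load one of the published model sizes ("tiny", "base", "small", etc.).
model = whisper.load_model("base")

# Transcribe an audio file (hypothetical path for illustration).
result = model.transcribe("consultation.mp3")

# The returned dict includes the full transcript; nothing in the output
# distinguishes accurately transcribed speech from fabricated text.
print(result["text"])
```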


