
Should we fear an attack of the voice clones?


Audio deepfakes are easy to make, hard to detect, and getting more convincing, experts say.

And while its strong suit was conversation, not impersonation, another system Mr Jacob demonstrated produced credible copies of voices based on only small snippets of audio pulled from YouTube.

With major elections due this year in the UK, US and India, there are also concerns that audio deepfakes - the name for the kind of sophisticated fake voices AI can create - could be used to generate misinformation aimed at manipulating democratic outcomes.

Ms Martinez, who had a stint at Twitter tackling misinformation, argues that in a year when over half the world's population will head to the polls, social media firms must do more and should strengthen the teams fighting disinformation.

