Should we fear an attack of the voice clones?
Audio deepfakes are easy to make, hard to detect, and getting more convincing, experts say.
And while its strong suit was conversation, not impersonation, another system Mr Jacob demonstrated produced credible copies of voices, based on only small snippets of audio pulled from YouTube.

With major elections in the UK, US and India due this year, there are also concerns audio deepfakes - the name for the kind of sophisticated fake voices AI can create - could be used to generate misinformation aimed at manipulating democratic outcomes.

Ms Martinez, who had a stint at Twitter tackling misinformation, argues that in a year when over half the world's population will head to the polls, social media firms must do more, and should strengthen the teams fighting disinformation.