AI driving 'explosion' of fake nudes as victims say the law is failing them


There has been a huge rise in sexually explicit deepfakes, according to online harm experts, because software that digitally transforms a clothed picture into a naked one is easy to get hold of. For victims, the effect can be devastating.

Ofcom will later this month introduce codes of practice requiring internet companies to clamp down on the illegal distribution of fake nudes. But Sky News has met two victims of this relatively new trend who say the law needs to go further.

Earlier this year, social media influencer and former Love Island contestant Cally Jane Beech, 33, was horrified to discover that someone had used AI to turn an underwear brand photograph of her into a nude, and that it was being shared online.

Alex Davies-Jones, under-secretary of state for victims, told MPs in November: "We've committed to making an offence of creating a deepfake illegal and we will be legislating for that this session."

