Popular AI “nudify” sites sued amid shocking rise in victims globally

“Nudify” sites may be fined for making it easy to “see anyone naked,” suit says.

San Francisco City Attorney David Chiu is suing to shut down 16 of the most popular websites and apps that allow users to "nudify" or "undress" photos of mostly women and girls, who have been increasingly harassed and exploited by bad actors online.

"In California and across the country, there has been a stark increase in the number of women and girls harassed and victimized by AI-generated" non-consensual intimate imagery (NCII), and "this distressing trend shows no sign of abating," Chiu's suit said.

Chiu said the harmful deepfakes are often created "by exploiting open-source AI image generation models," such as earlier versions of Stable Diffusion, which can be honed or "fine-tuned" to easily "undress" photos of women and girls that are frequently yanked from social media.


Related news:

A hellish new AI threat: ‘Undressing’ sites targeted by SF authorities

Google's AI Search Gives Sites Dire Choice: Share Data or Die