Meta's platforms showed hundreds of "nudify" deepfake ads, CBS News investigation finds


Meta platforms such as Instagram have marketed AI tools that let users create sexually explicit images of real people.

"We have strict rules against non-consensual intimate imagery; we removed these ads, deleted the Pages responsible for running them and permanently blocked the URLs associated with these apps," a Meta spokesperson told CBS News in an emailed statement. Alexios Mantzarlis, director of the Security, Trust, and Safety Initiative at Cornell University's tech research center, has been studying the surge in AI deepfake networks marketing on social platforms for more than a year. A CBS News analysis of one "nudify" website promoted on Instagram showed that the site did not prompt any form of age verification prior to a user uploading a photo to generate a deepfake image.


Related news:

- Meta: Shut down your invasive AI Discover feed
- The Oversight Board says Meta isn't doing enough to fight celeb deepfake scams
- How much information do LLMs really memorize? Now we know, thanks to Meta, Google, Nvidia and Cornell