A Pedophile Filmed Kids At Disney World To Make AI Child Abuse Images, Cops Say


The AI image generator Stable Diffusion was used to turn thousands of GoPro-captured photographs of children visiting the park into child sexual abuse material that was traded online.

Justin Culmo, who was arrested in mid-2023, admitted to creating thousands of illegal images of children taken at the amusement park and at least one middle school, using a version of the AI model Stable Diffusion, according to federal agents who presented the case to a group of law enforcement officials in Australia earlier this month.

David Thiel, chief technologist at the Stanford Internet Observatory, told Forbes that Stable Diffusion's original developers should have better vetted their training data for explicit imagery. Animated child pornography has long been prosecutable in the U.S., and the Justice Department's recent comments on charging Herrera indicate it plans to take a hard line on all illicit AI-created material.

Related news:

Chatbots offer cops the “ultimate out” to spin police reports, expert says

Police Chief Says Cops Have a 5th Amendment Right to Leave Body Cameras Off

Future Fords May Detect Speeding and Report You to the Cops