A Pedophile Filmed Kids At Disney World To Make AI Child Abuse Images, Cops Say
AI image generator Stable Diffusion was used to turn thousands of GoPro photographs of children visiting the park into child sexual abuse material that was traded online.
The man, Justin Culmo, who was arrested in mid-2023, admitted to creating thousands of illegal images of children photographed at the amusement park and at least one middle school, using a version of the AI model Stable Diffusion, according to federal agents who presented the case to a group of law enforcement officials in Australia earlier this month.

David Thiel, chief technologist at the Stanford Internet Observatory, told Forbes that Stable Diffusion's original developers should have vetted their training data more thoroughly for explicit imagery.

Animated child pornography has long been prosecutable in the U.S., and the Justice Department's recent comments on charging Herrera indicate it plans to take a hard line on all illicit AI-created material.