An AI Image Generator’s Exposed Database Reveals What People Really Used It For
An unsecured database used by a generative AI app revealed prompts and tens of thousands of explicit images—some of which are likely illegal. The company deleted its websites after WIRED reached out.
(Researchers, victims' advocates, journalists, tech companies, and more have largely phased out the phrase “child pornography” in favor of CSAM over the last decade.) As generative AI systems have made it vastly easier to create and modify images in the past two years, there has been an explosion of AI-generated CSAM. “Webpages containing AI-generated child sexual abuse material have more than quadrupled since 2023, and the photorealism of this horrific content has also leapt in sophistication,” says Derek Ray-Hill, the interim CEO of the Internet Watch Foundation (IWF), a UK-based nonprofit that tackles online CSAM.