AI is overpowering efforts to catch child predators, experts warn | Safety groups say images are so lifelike that it can be hard to see if real children were subject to harms in production


The volume of sexually explicit images of children being generated by predators using artificial intelligence is overwhelming law enforcement’s capacity to identify and rescue real-life victims, child safety experts warn. When a known image of child sexual abuse is uploaded, tech companies running monitoring software can intercept and block it based on its hash value and report the user to law enforcement. AI-generated images, however, are new material with no matching hash on record. “If major companies are unwilling to do the basics with CSAM detection, why would we think they would take all these extra steps in this AI world without regulation?” said Sarah Gardner, CEO of the Heat Initiative, a Los Angeles-based child safety group.
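The hash-based interception described above can be illustrated with a minimal sketch. Production systems (e.g. Microsoft's PhotoDNA) use perceptual hashes that also match re-encoded or slightly altered copies; the simplified version below uses a plain cryptographic hash, which only catches byte-identical files, and a made-up blocklist entry (the well-known SHA-256 of the bytes `abc` stands in for a real hash database).

```python
import hashlib

# Hypothetical blocklist of hashes of known abuse images.
# Placeholder entry: SHA-256 of b"abc" stands in for a real database record.
KNOWN_HASHES = {
    "ba7816bf8f01cfea414140de5dae2223b00361a396177a9cb410ff61f20015ad",
}

def is_known_image(data: bytes) -> bool:
    """Return True if the uploaded bytes hash to a blocklisted value."""
    digest = hashlib.sha256(data).hexdigest()
    return digest in KNOWN_HASHES
```

This is exactly why the mechanism fails against AI-generated material: a freshly generated image has never been hashed and catalogued, so no lookup can match it, and detection falls back on far harder content analysis.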
