AI is overpowering efforts to catch child predators, experts warn
Safety groups say the images are so lifelike that it can be hard to tell whether real children were subjected to harm in their production
The volume of sexually explicit images of children being generated by predators using artificial intelligence is overwhelming law enforcement's capacity to identify and rescue real-life victims, child safety experts warn.

When a known image of child sexual abuse is uploaded, tech companies running monitoring software can intercept and block it based on its hash value and report the user to law enforcement.

"If major companies are unwilling to do the basics with CSAM detection, why would we think they would take all these extra steps in this AI world without regulation?" said Sarah Gardner, CEO of the Heat Initiative, a Los Angeles-based child safety group.
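For readers unfamiliar with hash-value matching, the sketch below illustrates the basic idea under simplified assumptions: an uploaded file's digest is compared against a list of digests of previously identified abuse material. The function names and the empty hash set are hypothetical, and real deployments (for example Microsoft's PhotoDNA) use perceptual hashes that survive resizing and re-encoding rather than the exact SHA-256 matching shown here.

```python
import hashlib

# Hypothetical store of hex digests of known abuse imagery; in practice this
# would be populated from a hash list shared by a clearinghouse such as NCMEC.
KNOWN_IMAGE_HASHES: set[str] = set()


def sha256_of_file(path: str) -> str:
    """Compute the SHA-256 digest of a file, reading it in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()


def should_block_upload(path: str) -> bool:
    """Return True if the file's digest matches a known-image hash.

    Exact-hash matching only catches byte-identical copies; production
    systems rely on perceptual hashing to handle re-encoded variants.
    """
    return sha256_of_file(path) in KNOWN_IMAGE_HASHES
```

The key limitation the experts describe follows from this design: hash lists can only flag material that has already been identified, so novel AI-generated images fall outside the matching step entirely.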