Online child sex abuse material, boosted by AI, is outpacing Big Tech's regulation
Watchdogs say it's a "stark vision of the future."
Another UK watchdog report, published in the Guardian today, alleges that Apple is vastly underreporting the amount of child sexual abuse material (CSAM) shared via its products, prompting concern over how the company will manage content made with generative AI. While Apple made 267 reports of CSAM worldwide to the National Center for Missing and Exploited Children (NCMEC) in 2023, the National Society for the Prevention of Cruelty to Children (NSPCC) alleges that the company was implicated in 337 offenses involving child abuse images in England and Wales alone between April 2022 and March 2023. Apple declined the Guardian's request for comment, instead pointing the publication to its earlier decision not to scan iCloud photo libraries for CSAM, citing user security and privacy.