Online child sex abuse material, boosted by AI, is outpacing Big Tech's regulation


Watchdogs say it's a "stark vision of the future."

Another UK watchdog report, published in the Guardian today, alleges that Apple is vastly underreporting the amount of child sexual abuse material (CSAM) shared via its products, prompting concern over how the company will handle content made with generative AI. While Apple made 267 reports of CSAM worldwide to the National Center for Missing and Exploited Children (NCMEC) in 2023, the NSPCC alleges that the company was implicated in 337 offenses involving child abuse images in England and Wales alone — and those figures covered only the period between April 2022 and March 2023. Apple declined the Guardian's request for comment, instead pointing the publication to its earlier decision not to scan iCloud photo libraries for CSAM, citing user security and privacy.
