
Apple “clearly underreporting” child sex abuse, watchdogs say


Report: Apple vastly undercounts child sex abuse materials on iCloud and iMessage.

The United Kingdom's National Society for the Prevention of Cruelty to Children (NSPCC) shared UK police data with The Guardian showing that Apple is "vastly undercounting how often" CSAM is found globally on its services. Sarah Gardner, CEO of the Los Angeles-based child protection organization the Heat Initiative, told The Guardian that she considers Apple's platforms a "black hole" obscuring CSAM. Gardner agreed with Richard Collard, the NSPCC's head of child safety online policy, that Apple is "clearly underreporting" and has "not invested in trust and safety teams to be able to handle this" as it rushes to bring sophisticated AI features to its platforms.



Related news:


Apple accused of underreporting suspected CSAM on its platforms


UK watchdog accuses Apple of failing to report sexual images of children

