Apple “clearly underreporting” child sex abuse, watchdogs say
Report: Apple vastly undercounts child sex abuse materials on iCloud and iMessage.
The United Kingdom’s National Society for the Prevention of Cruelty to Children (NSPCC) shared UK police data with The Guardian showing that Apple is "vastly undercounting how often" child sexual abuse material (CSAM) is found globally on its services. Sarah Gardner, the CEO of a Los Angeles-based child protection organization called the Heat Initiative, told The Guardian that she considers Apple's platforms a "black hole" obscuring CSAM. Gardner agreed with Collard that Apple is "clearly underreporting" and has "not invested in trust and safety teams to be able to handle this" as it rushes to bring sophisticated AI features to its platforms.