Apple accused of underreporting suspected CSAM on its platforms


A child protection charity claims Apple is behind many of its peers "in tackling child sexual abuse," accusing it of underreporting CSAM cases.

As The Guardian, which first reported on the NSPCC's claim, points out, Apple services such as iMessage, FaceTime and iCloud all use end-to-end encryption, which prevents the company from viewing the contents of what users share on them. Apple announced its own on-device CSAM detection tools in 2021, but following a backlash from privacy and digital rights advocates, it delayed the rollout before ultimately killing the project in 2022. Apple declined to comment on the NSPCC's accusation, instead pointing The Guardian to the statement it made when it shelved the CSAM scanning plan.

Or read this on Engadget


Related news:

UK watchdog accuses Apple of failing to report sexual images of children

Apple TV+ Curbs Costs After Expensive Projects Fail to Capture Viewers

Apple Tries to Rein In Hollywood Spending After Years of Losses