Apple accused of underreporting suspected CSAM on its platforms
A child protection charity claims Apple is behind many of its peers "in tackling child sexual abuse," accusing it of underreporting CSAM cases.
As The Guardian, which first reported on the NSPCC's claim, points out, Apple services such as iMessage, FaceTime and iCloud all use end-to-end encryption, which prevents the company from viewing the contents of what users share on them. In 2021, Apple announced tools to detect CSAM in iCloud Photos, but following a backlash from privacy and digital rights advocates, it delayed the rollout before ultimately killing the project in 2022. Apple declined to comment on the NSPCC's accusation, instead pointing The Guardian to the statement it made when it shelved the CSAM scanning plan.