UK watchdog accuses Apple of failing to report sexual images of children

Exclusive: NSPCC finds Apple implicated in more cases of predators sharing child abuse imagery in England and Wales alone than the company reported globally in a year

Apple is failing to effectively monitor its platforms or scan for images and videos of the sexual abuse of children, child safety experts allege, raising concerns about how the company will handle the growing volume of such material associated with artificial intelligence.

In a single year, child predators used Apple's iCloud, iMessage and FaceTime to store and exchange child sexual abuse material (CSAM) in more cases in England and Wales alone than the company reported across all other countries combined, according to police data obtained by the NSPCC.

Through data gathered via freedom of information requests and shared exclusively with the Guardian, the children's charity found Apple was implicated in 337 recorded offenses of child abuse images between April 2022 and March 2023 in England and Wales.

Related news:

Apple TV+ Curbs Costs After Expensive Projects Fail to Capture Viewers

Apple Tries to Rein In Hollywood Spending After Years of Losses

Apple's New DAC Patent Promises Flawless Sound Quality on All Apple Devices by Eliminating One Thing