Apple sued for failing to implement tools that would detect CSAM in iCloud
A lawsuit has been filed on behalf of a potential group of 2,680 victims, The New York Times reports.
The suit claims that, after Apple showed off its planned child safety tools, the company “failed to implement those designs or take any measures to detect and limit” child sexual abuse material (CSAM) on its devices, leading to the victims’ harm as the images continued to circulate. In a statement to The New York Times about the lawsuit, Apple spokesperson Fred Sainz said, “Child sexual abuse material is abhorrent and we are committed to fighting the ways predators put children at risk. We are urgently and actively innovating to combat these crimes without compromising the security and privacy of all our users.” The lawsuit comes just a few months after Apple was accused of underreporting CSAM by the UK’s National Society for the Prevention of Cruelty to Children (NSPCC).
Or read this on Engadget