Apple sued over abandoning CSAM detection for iCloud

Apple is being sued over its decision not to implement a system that would have scanned iCloud photos for child sexual abuse material (CSAM). The lawsuit argues that by not doing more to prevent the spread of this material, the company is forcing victims to relive their trauma, according to The New York Times. James Marsh, an attorney involved with the lawsuit, said there is a potential group of 2,680 victims who could be entitled to compensation in the case.

Read the full story on TechCrunch

Read more on: Apple, iCloud, CSAM detection

Related news:

Woot Expands Apple Watch Band Sale With Even More Solo/Braided Loops at Massive Discounts

Apple Faces Lawsuit Over Child Sexual Abuse Material on iCloud

Apple and Sony are working on Vision Pro support for PSVR 2 controllers