Apple sued over abandoning CSAM detection for iCloud
Apple is being sued over its decision not to implement a system that would have scanned iCloud photos for child sexual abuse material (CSAM). The lawsuit argues that by not doing more to prevent the spread of this material, Apple is forcing victims to relive their trauma, according to The New York Times. Attorney James Marsh, who is involved in the lawsuit, said there is a potential group of 2,680 victims who could be entitled to compensation in this case.