
Apple hit with $1.2B lawsuit after killing controversial CSAM-detecting tool.


Apple knowingly ignoring child porn is a “never-ending nightmare,” lawsuit says.

Some survivors are now in their late 20s but were victimized when they were only infants or toddlers, and they have been traumatized by the ongoing crime notifications they have received for decades, including some showing that images of their abuse were found on Apple devices and services. Apple's features like Communication Safety, for example, warn children when they receive or attempt to send content containing nudity, to help break the chain of coercion that leads to child sexual abuse. When Apple devices are used to spread CSAM, it's a huge problem for survivors, who allegedly face a range of harms, including "exposure to predators, sexual exploitation, dissociative behavior, withdrawal symptoms, social isolation, damage to body image and self-worth, increased risky behavior, and profound mental health issues, including but not limited to depression, anxiety, suicidal ideation, self-harm, insomnia, eating disorders, death, and other harmful effects."


Read more on: Apple, controversial CSAM, child porn

Related news:

Apple Shares Are on a Tear Despite Sluggish Growth, Tariff Risks

Apple's Pro Display XDR Is Five Years Old Today

Apple Seeds Second Release Candidate Versions of iOS 18.2 and More With Genmoji, Image Playground and ChatGPT Integration