Apple hit with $1.2B lawsuit after killing controversial CSAM-detecting tool
Apple knowingly ignoring child porn is a "never-ending nightmare," lawsuit says.
Some survivors, now in their late 20s, were victimized when they were only infants or toddlers. They have been traumatized by ongoing crime notifications received over decades, including some showing that images of their abuse have been found on Apple devices and services.

Features like Communication Safety, for example, warn children when they receive or attempt to send content that contains nudity, to help break the chain of coercion that leads to child sexual abuse.

When Apple devices are used to spread CSAM, it's a huge problem for survivors, who allegedly face a range of harms, including "exposure to predators, sexual exploitation, dissociative behavior, withdrawal symptoms, social isolation, damage to body image and self-worth, increased risky behavior, and profound mental health issues, including but not limited to depression, anxiety, suicidal ideation, self-harm, insomnia, eating disorders, death, and other harmful effects."