Effective CSAM filters are impossible because what CSAM is depends on context
Automatically tagging or filtering child sexual abuse material (CSAM) cannot be effective and preserve privacy at the same time, regardless of what kind of tech one throws at it. Literally the same photo, bit-by-bit identical, can be an innocent keepsake when sent between family members, and a case of CSAM when shared on a child porn group. If politicians wanted to be serious about solving the problem of sexual exploitation of children, they would stop wasting their (and everybody else's) time and energy on wishful thinking, misdirection, and technosolutionism.
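To make the technical point concrete, here is a minimal sketch, assuming a hash-matching filter of the kind proposed for client-side scanning; the byte values and variable names are purely illustrative. A content-based filter only ever sees the bytes of the photo, so the same bytes produce the same verdict no matter who shares them with whom:

```python
import hashlib

# A hypothetical photo, represented as raw bytes; a content-based filter
# only ever sees these bytes, never the social context around the sharing.
photo_bytes = b"\xff\xd8\xff\xe0 stand-in for a real JPEG payload"

# The same bytes shared in two very different contexts produce the same
# digest, so a hash-matching filter necessarily makes the same call in both.
digest_in_family_chat = hashlib.sha256(photo_bytes).hexdigest()
digest_on_abuse_forum = hashlib.sha256(photo_bytes).hexdigest()

assert digest_in_family_chat == digest_on_abuse_forum
print("identical content, identical digest:", digest_in_family_chat)
```

The same limitation applies to perceptual hashes or classifiers: whatever function is computed over the image, its input is the content, and the context that makes the photo innocent or abusive is simply not part of that input.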