Effective CSAM filters are impossible because what CSAM is depends on context


Automatically tagging or filtering child sexual abuse material (CSAM) cannot be both effective and privacy-preserving, regardless of what kind of tech one throws at it, because whether an image is CSAM depends on context that the image itself does not carry. The very same photo, bit-for-bit identical, can be an innocent family keepsake when sent between relatives, and CSAM when shared in a child abuse forum. If politicians wanted to be serious about solving the problem of sexual exploitation of children, they would stop wasting their (and everybody else’s) time and energy on wishful thinking, misdirection, and technosolutionism.
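
To make the point concrete, consider what a scanner actually receives as input. The sketch below is illustrative only (the function name and placeholder bytes are mine, not from the article): any content-based matcher, whether a cryptographic hash or a perceptual one, computes its verdict from the image bytes alone, so two transmissions of the same file are indistinguishable to it.

import hashlib

def content_fingerprint(image_bytes: bytes) -> str:
    # Stand-in for any content-based matcher (cryptographic or perceptual
    # hash): its output is a pure function of the bytes, nothing else.
    return hashlib.sha256(image_bytes).hexdigest()

# The same photo, bit-for-bit identical in both events.
photo = b"...identical image bytes..."

# Two very different contexts, invisible to the matcher.
sent_to_grandparents = content_fingerprint(photo)
posted_to_abuse_forum = content_fingerprint(photo)

# The scanner sees the same value in both cases; the context that makes
# one transmission harmless and the other criminal is not in its input.
assert sent_to_grandparents == posted_to_abuse_forum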
