
Google’s Nonconsensual Explicit Images Problem Is Getting Worse


Reports of intimate images and video posted online without consent are growing, and deepfakes add a horrifying new dimension to the problem. Google insiders say they’ve struggled to get executives to act.

The sources describe previously unreported internal deliberations, including Google's rationale for not adopting StopNCII, an industry tool that shares information about nonconsensual intimate imagery (NCII), and the company's failure to require that porn websites verify performers' consent in order to qualify for search traffic. Porn producers, who collect identity information from performers as required by US law, support sharing a consent signal with search engines, says Mike Stabile, spokesperson for the industry trade body the Free Speech Coalition. One source says staff persuaded executives to update the policy in part by stressing how important it is for people who became adult performers on OnlyFans out of financial necessity to be able to later revoke their consent and sever any ties to sex work.

Read the full story on Wired.

Read more on: Google

Related news:

- Google Wants To Start Tracking 300 Million iPhone Users Within 5 Years
- Google's AI search summaries use 10x more energy than just doing a normal Google search
- New AI Training Technique Is Drastically Faster, Says Google