Google’s Nonconsensual Explicit Images Problem Is Getting Worse
Reports of intimate images and video posted online without consent are growing, and deepfakes add a horrifying new dimension to the problem. Google insiders say they’ve struggled to get executives to act.
The sources describe previously unreported internal deliberations, including Google's rationale for not using an industry tool called StopNCII, which shares information about nonconsensual intimate imagery (NCII), and the company's failure to demand that porn websites verify consent in order to qualify for search traffic. Porn producers, who collect identity information from performers as required by US law, support sharing a consent signal with search engines, says Mike Stabile, spokesperson for the industry trade body Free Speech Coalition. The same source says staff persuaded executives to update the policy in part by stressing the importance of letting people who had become adult performers on OnlyFans out of financial necessity later revoke their consent and sever any ties to sex work.