Instagram Ads Send This Nudify Site 90 Percent of Its Traffic
A service for creating AI-generated nude images of real people is running circles around Meta’s moderation efforts.
As I reported last week, extensive testing by AI Forensics, a European non-profit that investigates influential and opaque algorithms, found that nudity uploaded to Instagram and Facebook by an ordinary user was promptly removed for violating Meta's Community Standards.

In a statement, Meta said: "This is a highly adversarial space and bad actors are constantly evolving their tactics to avoid enforcement, which is why we continue to invest in the best tools and technology to help identify and remove violating content."

As we've previously reported, these nudify apps are among the most harmful applications of generative AI because they make it so easy to create nonconsensual images of anyone.