Nightshade, a free tool to poison AI data scraping, is now available to try
Nightshade was initially introduced in October 2023 as a tool to challenge large generative AI models.
Created by researchers at the University of Chicago, Nightshade is an "offensive" tool that protects artists' and creators' work by "poisoning" an image, making it unsuitable for AI training. A shaded image of a cow in a green field, for example, can be transformed into what AI algorithms will interpret as a large leather purse lying in the grass, the researchers said. By contrast, the team's companion tool Glaze offers a "defensive" approach to content poisoning and can be used by individual artists to protect their creations against "style mimicry attacks," the researchers explained.