Nightshade, the free tool that ‘poisons’ AI models, is now available for artists to use
The tool's creators are seeking to make it so that AI model developers must pay artists for uncorrupted training data.
The Glaze/Nightshade team, for its part, denies it is seeking destructive ends, writing: "Nightshade's goal is not to break models, but to increase the cost of training on unlicensed data, such that licensing images from their creators becomes a viable alternative. [...] It does not rely on the kindness of model trainers, but instead associates a small incremental price on each piece of data scraped and trained without authorization."

Scraping itself has been a common practice on the internet since before the advent of generative AI, and is roughly the same technique used by Google and Bing to crawl and index websites for search results.
Read the original story on VentureBeat.