Nightshade, the free tool that ‘poisons’ AI models, is now available for artists to use


The tool's creators aim to make it so that AI model developers must pay artists for uncorrupted data to train on.

The Glaze/Nightshade team, for its part, denies that it is seeking destructive ends, writing: “Nightshade’s goal is not to break models, but to increase the cost of training on unlicensed data, such that licensing images from their creators becomes a viable alternative.” Scraping data from the web has been a common practice on the internet, used frequently prior to the advent of generative AI, and is roughly the same technique Google and Bing use to crawl and index websites for search results. Nightshade, its creators write, “does not rely on the kindness of model trainers, but instead associates a small incremental price on each piece of data scraped and trained without authorization.”
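For readers curious what “poisoning” means in practice, below is a minimal, hypothetical sketch of the general idea behind this class of image-poisoning techniques: add a small, visually subtle perturbation that pushes an image’s learned features toward a different concept, so that models trained on scraped copies learn the wrong association. This is not Nightshade’s actual algorithm; the feature extractor, loss, and perturbation budget are illustrative assumptions.

# A minimal, hypothetical sketch of the general idea behind image "poisoning":
# nudge an image with a small, visually subtle perturbation so that a feature
# extractor "sees" it as a different concept. This is NOT Nightshade's actual
# algorithm; the extractor, loss, and budget here are illustrative assumptions.
import torch
import torch.nn.functional as F
import torchvision.models as models

def poison_image(image, target_image, steps=100, epsilon=8 / 255, lr=0.01):
    """Return `image` plus a bounded perturbation whose features (under a
    generic pretrained extractor) move toward those of `target_image`.
    Both inputs are float tensors of shape (1, 3, H, W) with values in [0, 1]."""
    extractor = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
    extractor.fc = torch.nn.Identity()  # use penultimate-layer features
    extractor.eval()
    for p in extractor.parameters():
        p.requires_grad_(False)

    with torch.no_grad():
        target_feat = extractor(target_image)

    delta = torch.zeros_like(image, requires_grad=True)
    opt = torch.optim.Adam([delta], lr=lr)

    for _ in range(steps):
        feat = extractor((image + delta).clamp(0, 1))
        loss = F.mse_loss(feat, target_feat)  # pull features toward the target concept
        opt.zero_grad()
        loss.backward()
        opt.step()
        with torch.no_grad():
            delta.clamp_(-epsilon, epsilon)  # keep the change visually subtle

    return (image + delta.detach()).clamp(0, 1)

In this framing, each poisoned image that slips into a scraped dataset slightly degrades the model’s learned association for the original concept, which is the “incremental price” the team describes.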

Read this on VentureBeat

Read more on:

AI models

Artists

free tool

Related news:

EU wants music streaming platforms to pay artists more fairly

Music Streaming Platforms Must Pay Artists More, Says EU
EU says music streaming platforms must pay artists more