
AI art protection tools like Glaze and Nightshade are easy to get around, researchers say


Artists urgently need stronger defences to protect their work from being used to train AI models without their consent.

Glaze and Nightshade work by adding subtle, largely imperceptible distortions ("poison") to images so that they disrupt AI training. Both tools are popular with digital artists who want to stop artificial intelligence models (such as the AI art generator Stable Diffusion) from copying their unique styles without consent. The researchers' new method, LightShed, can detect, reverse-engineer and remove these distortions, effectively stripping away the poison and rendering the images usable again for generative AI model training. Although LightShed reveals serious vulnerabilities in art protection tools, the researchers stress that it was developed not as an attack on them, but as an urgent call to action to produce better, more adaptive ones.
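The article does not detail how LightShed is built, so the snippet below is only a minimal, hypothetical sketch of the general idea it describes: estimate the added perturbation, then subtract it to recover a trainable image. The names (PerturbationEstimator, make_batch), the synthetic stand-in data, and the training setup are all assumptions for illustration, not the researchers' published pipeline.

```python
# Illustrative sketch only (NOT the LightShed implementation): train a small
# convolutional network to predict the perturbation added to a "poisoned"
# image, then subtract that prediction to recover a usable image.
# The "clean" images and "poison" here are synthetic stand-ins.
import torch
import torch.nn as nn

class PerturbationEstimator(nn.Module):
    """Tiny conv net that maps a poisoned image to an estimate of its perturbation."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 3, 3, padding=1),
        )

    def forward(self, x):
        return self.net(x)

def make_batch(n=8, size=64):
    """Synthetic stand-ins: random 'clean' images plus small noise as the 'poison'."""
    clean = torch.rand(n, 3, size, size)
    poison = 0.05 * torch.randn(n, 3, size, size)   # imperceptible-scale perturbation
    return clean, (clean + poison).clamp(0, 1)

model = PerturbationEstimator()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for step in range(200):                              # short demo training loop
    clean, poisoned = make_batch()
    est_poison = model(poisoned)                     # predicted perturbation
    loss = loss_fn(poisoned - est_poison, clean)     # recovered image should match clean
    opt.zero_grad()
    loss.backward()
    opt.step()

# "Cleaning" a new poisoned image: subtract the estimated perturbation.
clean, poisoned = make_batch(n=1)
with torch.no_grad():
    recovered = (poisoned - model(poisoned)).clamp(0, 1)
print("reconstruction error:", loss_fn(recovered, clean).item())
```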

Read more on: researchers, Nightshade, Glaze

Related news:

Researchers get viable mice by editing DNA from two sperm | Altering chemical modifications of DNA lets the DNA from two sperm make a mouse.

Researchers using the same data and hypothesis arrive at different conclusions (2022)

Latest Parkinson’s puzzle piece could mean earlier diagnosis | Researchers have found immune cells that are active long before Parkinson's symptoms appear