AI art protection tools like Glaze and Nightshade are easy to get around, researchers say
Artists urgently need stronger defences to protect their work from being used to train AI models without their consent.
Glaze and Nightshade are popular with digital artists who want to stop artificial intelligence models (such as the AI art generator Stable Diffusion) from copying their unique styles without consent. Both tools work by adding subtle distortions, or "poison", to an image before it is shared online. LightShed, a new method developed by the researchers, can detect, reverse-engineer and remove these distortions, effectively stripping away the poison and rendering the images usable again for generative AI model training. Although LightShed reveals serious vulnerabilities in art protection tools, the researchers stress that it was developed not as an attack on them, but as an urgent call to action to produce better, more adaptive defences.