Artists Are Using Nightshade Data Poisoning Tool In Escalating War With Generative AI

Hero image: a bottle of poison.
Artists are using a new tool that adds invisible changes to their art before they upload it online, in an effort to keep their work from being scraped into AI training sets. If the altered images are scraped anyway, they cause the resulting model to "break in chaotic and unpredictable ways."

Artists are battling to protect their work against being stolen by generative AI, with some going as far as filing lawsuits. With OpenAI recently opening ChatGPT up to the entire internet, the battle will likely only escalate. However, artists now have a new tool to protect their valuable work: Nightshade.

Example of AI-generated art: a robot on fire.

Nightshade is designed to fight back against AI companies that use an artist's work to train their models without the artist's permission. Essentially, the technique "poisons" the training data, damaging future iterations of image-generating models such as DALL-E, Midjourney, and Stable Diffusion. The hope is that any images generated by a model trained on the poisoned work will be useless.
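
To make the idea concrete, here is a minimal, hypothetical sketch of how a poisoned sample could slip into a scraped text-to-image dataset. The file names, captions, and the `TrainingSample` structure are illustrative assumptions, not part of Nightshade's actual pipeline.

```python
# Hypothetical illustration of how a poisoned sample enters a scraped dataset.
# Nothing here is Nightshade's real code; names and structure are assumptions.
from dataclasses import dataclass

@dataclass
class TrainingSample:
    image_path: str
    caption: str

# To a scraper, the poisoned image still looks like an ordinary dog painting,
# so it is ingested with its original caption...
scraped_batch = [
    TrainingSample("artist_dog_poisoned.png", "a watercolor painting of a dog"),
    TrainingSample("stock_photo_123.jpg", "a golden retriever in a park"),
]

# ...but the poisoned file's pixel-level features have been nudged toward a
# different concept (say, "cat"), so a model trained on enough such pairs can
# start producing the wrong concept when prompted for dogs.
```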

The developers of Nightshade describe it as "an optimized prompt-specific poisoning attack where poison samples look visually identical to benign images with matching text prompts." According to the team, fewer than 100 poison samples are enough to corrupt a Stable Diffusion SDXL prompt. The creators add that Nightshade should only be used as a last line of defense against web scrapers that ignore opt-out and do-not-crawl directives.
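
For readers curious what "visually identical poison samples" can mean in practice, below is a minimal sketch of one common way to build such samples: optimize a small, bounded pixel perturbation so the image's features (under some differentiable image encoder) drift toward an anchor image of a different concept. This is a generic feature-space poisoning sketch under assumptions of our own (the `encoder`, the L-infinity budget `eps`, and the optimizer settings), not Nightshade's published algorithm.

```python
# Generic sketch of a "visually identical" poison sample via feature-space
# perturbation. Assumptions: `encoder` is any differentiable image feature
# extractor (e.g. a CLIP-style image encoder); images are C x H x W tensors
# with values in [0, 1]. This is NOT Nightshade's actual implementation.
import torch
import torch.nn.functional as F

def make_poison(image: torch.Tensor, anchor_image: torch.Tensor,
                encoder: torch.nn.Module, eps: float = 0.03,
                steps: int = 200, lr: float = 0.01) -> torch.Tensor:
    """Perturb `image` within an L-infinity budget `eps` so its features move
    toward those of `anchor_image` (a different concept), while the pixels
    stay visually close to the original."""
    delta = torch.zeros_like(image, requires_grad=True)
    optimizer = torch.optim.Adam([delta], lr=lr)

    # Target features come from the anchor concept the attacker wants learned.
    with torch.no_grad():
        target_features = encoder(anchor_image.unsqueeze(0))

    for _ in range(steps):
        poisoned = (image + delta).clamp(0.0, 1.0)
        loss = F.mse_loss(encoder(poisoned.unsqueeze(0)), target_features)
        optimizer.zero_grad()
        loss.backward()
        optimizer.step()
        with torch.no_grad():
            delta.clamp_(-eps, eps)  # keep the change imperceptible to humans

    return (image + delta).clamp(0.0, 1.0).detach()
```

The key design point the sketch illustrates is the small perturbation budget: the poisoned image is kept close enough to the original that a human (or a scraper's filter) sees nothing wrong, while the model's feature extractor sees something quite different.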

Ben Zhao, a professor at the University of Chicago who led the team that created Nightshade, remarked that he hopes the new technology will help tip the balance of power back toward artists. According to an article by MIT Technology Review, his team is also responsible for creating Glaze, a tool that allows artists to "mask" their personal style to prevent it from being scraped by AI companies.

Junfeng Yang, a computer science professor at Columbia University, remarked that Nightshade will make AI companies think twice, because taking an artist's work without consent now carries the risk of "destroying their entire model."