Artists Can Use This Tool to Protect Their Work From A.I. Scraping

Nightshade subtly alters the pixels of an image to mislead A.I. image generators, ultimately damaging the models

Researchers at the University of Chicago have developed a new technique that allows artists to embed invisible “poison” into their work that misleads A.I. models. Pexels

As artificial intelligence image generators become more popular and powerful, artists worry that their work will be used without permission to train tools like DALL-E, Midjourney and Stable Diffusion.

Now, researchers at the University of Chicago have developed a technique that artists can use to embed invisible “poison” in their work, reports MIT Technology Review’s Melissa Heikkilä. The tool, called Nightshade, changes an image’s pixels in a way that humans can’t detect.

Computers, however, will notice these changes, which are carefully designed to corrupt A.I. models’ ability to correctly match images with text. If an A.I. model is trained on these kinds of images, its abilities will begin to break down. It will learn, for example, that cars are cows, or that cartoon art is Impressionism.

“This way, to a human or simple automated check, the image and the text seem aligned,” writes Ars Technica’s Benj Edwards. “But in the model’s latent space, the image has characteristics of both the original and the poison concept, which leads the model astray when trained on the data.”
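To make that latent-space idea concrete, here is a minimal, hypothetical sketch in PyTorch. It is not Nightshade’s actual algorithm: it simply optimizes a small pixel perturbation, capped by an imperceptibility budget, so that a stand-in encoder maps the image close to a “poison” concept’s embedding. The toy encoder, the image size and the epsilon value are all illustrative assumptions.

```python
# A toy, hypothetical sketch of latent-space poisoning, NOT Nightshade's
# published algorithm. We optimize a small, bounded pixel perturbation so a
# stand-in encoder maps the image near a "poison" concept's embedding, while
# the pixel change stays below an imperceptibility budget.
import torch
import torch.nn as nn

torch.manual_seed(0)

# Illustrative stand-in; a real attack would target the generator's own encoder.
encoder = nn.Sequential(nn.Flatten(), nn.Linear(3 * 32 * 32, 64))

original = torch.rand(1, 3, 32, 32)   # the artist's image (random toy data)
poison_target = torch.randn(1, 64)    # embedding of the "poison" concept (assumed)

delta = torch.zeros_like(original, requires_grad=True)
optimizer = torch.optim.Adam([delta], lr=0.01)
epsilon = 0.03                        # max per-pixel change a human shouldn't notice

for step in range(200):
    optimizer.zero_grad()
    poisoned = (original + delta).clamp(0, 1)
    loss = nn.functional.mse_loss(encoder(poisoned), poison_target)
    loss.backward()
    optimizer.step()
    with torch.no_grad():             # keep the perturbation inside the budget
        delta.clamp_(-epsilon, epsilon)

print(f"latent distance to poison concept: {loss.item():.4f}")
print(f"max pixel change: {delta.abs().max().item():.4f}")  # <= epsilon
```

To a viewer, the poisoned copy looks unchanged, but in the encoder’s latent space it now sits closer to the poison concept, which is the property the quoted passage describes.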

Because models are trained on vast datasets, identifying poisoned images is a complex and time-consuming task for tech companies, and even just a few misleading samples can do damage. When researchers fed Stable Diffusion 50 poisoned images designed to make the model confuse dogs with cats, it started generating distorted images of dogs. After 100 samples, the model began producing images that were more cat than dog. At 300, virtually no doglike features remained.
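For a rough sense of why so few samples matter, here is an assumed, purely illustrative mock-up of that dog-and-cat setup (the real study used actual perturbed images and Stable Diffusion; the counts and file names below are invented). Poisoned samples are a rounding error in a whole dataset, yet they quickly become a large share of one concept’s training signal:

```python
# Invented numbers for illustration only: mix n poisoned, cat-like images
# captioned as dogs into a pool of 1,000 correctly captioned dog photos.
def poisoned_dataset(n_poisoned, n_clean_dogs=1_000):
    data = [(f"dog_{i:04d}.png", "a photo of a dog") for i in range(n_clean_dogs)]
    data += [(f"poison_{i:04d}.png", "a photo of a dog") for i in range(n_poisoned)]
    return data

for n in (50, 100, 300):
    data = poisoned_dataset(n)
    dog_captioned = [pair for pair in data if "dog" in pair[1]]
    print(f"{n} poisoned samples = {n / len(dog_captioned):.1%} of 'dog' training signal")
```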

Previously, the team released a similar tool called Glaze, which disguises an artist’s style from A.I. tools trying to mimic it. Nightshade will eventually be integrated into Glaze.

Ultimately, the researchers hope Nightshade can give artists more power as they face off against A.I., Ben Zhao, a computer scientist at the University of Chicago who led the Nightshade team, tells Hyperallergic’s Elaine Velie.

“I think right now there’s very little incentive for companies to change the way that they have been operating—which is to say, ‘Everything under the sun is ours, and there’s nothing you can do about it,’” he says. “I guess we’re just sort of giving them a little bit more nudge towards the ethical front, and we’ll see if it actually happens.”

While Nightshade can protect artists’ work from newer models, it can’t retroactively protect art from older ones. “It works at training time and destabilizes [the model] for good,” Zhao tells Ryan Heath of Axios. “Of course, the model trainers can just revert to an older model, but it does make it challenging for them to build new models.”

As Zhao tells MIT Technology Review, there is a chance that Nightshade’s technique could be misused for malicious purposes. Even so, he says, a targeted attack would be difficult, as it would require thousands of poisoned samples to inflict damage on larger models that are trained on billions of data samples.

Nightshade is an important step in the fight to defend artists going up against tech companies, says Marian Mazzone, a scholar in modern and contemporary art at the College of Charleston who also works for the Art and Artificial Intelligence Laboratory at Rutgers University.

“Artists now have something they can do, which is important,” she tells Hyperallergic. “Feeling helpless is no good.”

At the same time, Mazzone worries Nightshade may not be a long-term solution. She thinks that creators should continue to pursue legislative action connected to A.I. image generation, as corporations’ financial resources and A.I. technology’s rapid evolution could eventually make programs like Nightshade obsolete.

In the meantime, Nightshade’s existence is a morale booster for some artists, like Autumn Beverly. She tells MIT Technology Review that after discovering her work had been scraped without her consent, she stopped posting her art online. Tools like Nightshade and Glaze have made her comfortable sharing her work on the internet again.

“I’m just really grateful that we have a tool that can help return the power back to the artists for their own work,” she says.
