Nightshade app ‘poisons’ AI models copying your art [Awesome Apps]

Researchers at the University of Chicago have given artists a tool to fight AI models imitating their artwork. Photo: University of Chicago

Artists have every reason to fear artificial intelligence (AI) models that might train on their artwork online without permission, essentially imitating it and potentially diluting its value.

A new free app called Nightshade — yeah, after the poisonous plant — is here to mess with AI models to deter trainers from doing that.


Nightshade comes from the developers behind the Glaze app at the University of Chicago, and it serves a similar purpose. Glaze confuses AI about an image’s artistic style so the style can’t be copied accurately. Nightshade goes a step further: it throws off an AI model’s sense of what an image actually depicts, and that confusion persists inside any model trained on it.

The developers describe Glaze as a “defensive” tool against style copying and Nightshade as an “offensive” tool that distorts AI models. They suggest the two be used in tandem, and plan to combine them into a single tool after further testing.

“Glaze is a defensive tool that individual artists can use to protect themselves against style mimicry attacks, while Nightshade is an offensive tool that artists can use as a group to disrupt models that scrape their images without consent (thus protecting all artists against these models),” the developers wrote.

Confusing AI models at the pixel level

Nightshade uses an open-source machine-learning framework to apply subtle, pixel-level changes to an image, so small that humans barely notice them, but enough that an AI model sees something other than what the image actually depicts.
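To picture what a pixel-level change looks like in practice, here’s a minimal Python sketch. It’s illustrative only: it adds faint, bounded random noise, whereas Nightshade’s real perturbations are carefully optimized to mislead specific models, not random. The file names are hypothetical.

import numpy as np
from PIL import Image

# Illustrative only: nudge every pixel by a tiny, bounded amount.
# Nightshade's actual perturbations are optimized, not random noise.
def perturb(path_in, path_out, epsilon=3):
    img = np.asarray(Image.open(path_in).convert("RGB"), dtype=np.int16)
    noise = np.random.randint(-epsilon, epsilon + 1, size=img.shape)
    shaded = np.clip(img + noise, 0, 255).astype(np.uint8)
    Image.fromarray(shaded).save(path_out)

perturb("artwork.png", "artwork_shaded.png")  # hypothetical file names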

It serves as a deterrent because it carries a consequence for any AI model that trains on the art without consent: the model may carry its wrong impression into future output, generating the wrong imagery in response to unrelated prompts.

Here’s more description from the devs:

We have designed … a tool that turns any image into a data sample that is unsuitable for model training. More precisely, Nightshade transforms images into “poison” samples, so that models training on them without consent will see their models learn unpredictable behaviors that deviate from expected norms, e.g. a prompt that asks for an image of a cow flying in space might instead get an image of a handbag floating in space.

Used responsibly, Nightshade can help deter model trainers who disregard copyrights, opt-out lists, and do-not-scrape/robots.txt directives. It does not rely on the kindness of model trainers, but instead associates a small incremental price on each piece of data scraped and trained without authorization. Nightshade’s goal is not to break models, but to increase the cost of training on unlicensed data, such that licensing images from their creators becomes a viable alternative.
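The “poison sample” idea is easier to see with a toy example. The Python sketch below is an assumed, drastically simplified setup (nothing like Nightshade’s actual technique): a handful of samples labeled “cow” that actually carry handbag-like features drags a model’s learned notion of “cow” toward “handbag.”

import numpy as np

rng = np.random.default_rng(0)

# Clean training data: 100 "cow" and 100 "handbag" samples,
# represented as simple 8-dimensional feature vectors.
cow = rng.normal(loc=0.0, scale=0.1, size=(100, 8))
handbag = rng.normal(loc=5.0, scale=0.1, size=(100, 8))

# 30 poisoned samples: labeled "cow" but with handbag-like features.
poison = rng.normal(loc=5.0, scale=0.1, size=(30, 8))

# A trivial "model" that averages the features it sees per label.
clean_centroid = cow.mean(axis=0)
poisoned_centroid = np.vstack([cow, poison]).mean(axis=0)

handbag_centroid = handbag.mean(axis=0)
print(np.linalg.norm(clean_centroid - handbag_centroid))     # ~14.1: far away
print(np.linalg.norm(poisoned_centroid - handbag_centroid))  # ~10.9: drifting closer

Scale that drift across thousands of scraped images and it compounds, which is exactly the incremental cost the developers describe.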

Anyone who downloads Nightshade can apply it to any artwork, though the developers urge caution when using it. To run the app, you’ll need a Mac with an Apple M1, M2 or M3 chip, or a PC running Windows 10 or 11.

Price: Free

Where to download: Nightshade, University of Chicago
