<!doctype html public "-//W3C//DTD W3 HTML//EN">
<html><head><style type="text/css"><!--
blockquote, dl, ul, ol, li { padding-top: 0 ; padding-bottom: 0 }
--></style><title>nightshade</title></head><body>
<div>The Glaze Project has released a new, free software tool called <a
href="https://nightshade.cs.uchicago.edu/">Nightshade</a> that artists
can apply to the work they post online and that <a
href=
"https://venturebeat.com/ai/nightshade-the-free-tool-that-poisons-ai-models-is-now-available-for-artists-to-use/"><span
></span>will "poison" artificial intelligence models that
seek to train on that artwork</a>.<br>
</div>
<div>AI models "train" themselves by having bots scour the
internet and scrape data on public websites. They "learn"
about the artwork they scrape in order to eventually be able to create
their own, derivative art. But the artists whose work was used to
train these models did not consent to the practice, and the models now
threaten their livelihoods by mimicking their art and competing with
them.<br>
</div>
<blockquote>Developed by computer scientists on <a
href="https://glaze.cs.uchicago.edu/index.html">the Glaze
Project</a> at the University of Chicago under Professor Ben
Zhao, <b>the tool essentially works by turning AI against
AI.</b> It makes use of the popular open-source machine learning
framework <a
href=
"https://venturebeat.com/ai/pytorch-2-0-brings-new-fire-to-open-source-machine-learning/"><span
></span>PyTorch</a> to identify what's in a given image, then
applies a tag that subtly alters the image at the pixel level so other
AI programs see something totally different than what's actually
there. …<br>
</blockquote>
<blockquote><b>An AI model that ended up training on many images
altered or "shaded" with Nightshade would likely erroneously
categorize objects going forward for all users of that model, even in
images that had not been shaded with Nightshade.</b><br>
</blockquote>
<div><a href="https://nightshade.cs.uchicago.edu/whatis.html">The
Nightshade team</a> explains:<br>
</div>
<blockquote><i>Nightshade is computed as a multi-objective
optimization that minimizes visible changes to the original image.
While human eyes see a shaded image that is largely unchanged from the
original, the AI model sees a dramatically different composition in
the image. For example, human eyes might see a shaded image
of a cow in a green field largely unchanged, but an AI model might see
a large leather purse lying in the grass. Trained on a sufficient
number of shaded images that include a cow, a model will become
increasingly convinced cows have nice brown leathery handles and
smooth side pockets with a zipper, and perhaps a lovely brand
logo.</i><br>
</blockquote>
<div>So the AI model would not only confuse a cow with a purse in the
artwork to which the Nightshade tool has been applied; it would begin
to confuse cows with purses in any image it processes, whether or not
Nightshade has been applied to that image.<br>
</div>
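<div>The optimization the team describes can be sketched in code. The
following is a hypothetical, heavily simplified illustration, not the
actual Nightshade implementation: a random linear map stands in for a
vision model's feature extractor, and flat vectors stand in for the cow
and purse images. The real tool optimizes against deep models in
PyTorch.<br>
</div>

```python
import numpy as np

# Toy sketch (assumptions, not Nightshade's code): find a small pixel-level
# perturbation `delta` so that a feature extractor "sees" the target concept
# (a purse) while the visible change to the original image stays small.

rng = np.random.default_rng(0)
F = rng.standard_normal((8, 32))   # stand-in linear "feature extractor"
cow = rng.random(32)               # flattened "cow" image (pixel vector)
purse = rng.random(32)             # flattened "purse" image
target_feat = F @ purse            # features the model should see instead

# Two objectives traded off by lam:
#   minimize ||F(cow + delta) - target_feat||^2   (model sees a purse)
#          + lam * ||delta||^2                    (change stays invisible)
# For this linear stand-in, the minimizer has a closed (ridge) form;
# the real tool would solve this iteratively on a deep network.
lam = 0.1
residual = target_feat - F @ cow
delta = np.linalg.solve(F.T @ F + lam * np.eye(32), F.T @ residual)

shaded = cow + delta
feat_gap = np.linalg.norm(F @ shaded - target_feat)  # model's view vs. target
pixel_change = np.linalg.norm(delta)                 # visible change
```

<div>In this toy setup the model-facing gap collapses nearly to zero,
while the pixel change stays well below what outright swapping the two
images would cause, which is the trade-off the team describes.<br>
</div>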
<div>The team sees this tool as a way for artists to fight back
against generative AI models that can ignore rules and opt-out lists
with impunity.</div>
</body>
</html>