Can This Data Poisoning Tool Help Artists Protect Their Work from AI Scraping?

November 21, 2023

Art by Patrick K. Lin, created using Canva.

By Patrick K. Lin

Generative AI tools like DALL·E, Midjourney, and Stable Diffusion are dominating the cultural zeitgeist but have not received a warm reception from artists. AI companies extract billions of images from the web, relying on artists’ pre-existing works to feed their models.[1]

The result is AI image generators producing images that contain artists’ visual artifacts and even artists’ signatures. For many artists, the speed and scalability of generative AI threaten to devalue the labor of creative expression, or to eliminate it outright. For instance, Marvel’s recent Disney+ show “Secret Invasion” featured an AI-generated opening credits sequence, sparking fears that artists will be replaced by AI tools. Similarly, during the writers’ strike, tech companies showed off a fake, AI-generated TV episode.

AI companies are facing a wave of lawsuits alleging copyright infringement, including the class action against Midjourney, Stability AI (maker of Stable Diffusion), and DeviantArt, as well as the Getty Images suit against Stability AI. Outside of taking expensive legal action, however, artists have very few defenses or tools at their disposal to deter companies and web scrapers from feeding their art to AI models without consent or compensation.

Enter Nightshade, a data poisoning tool developed by a team of researchers at the University of Chicago.[2] Nightshade changes an image’s pixels in ways that are imperceptible to the human eye. Machine learning models, however, do pick up these subtle changes, which are carefully designed to disrupt a model’s ability to correctly label the image. If an AI model is trained on these “poisoned samples,” their invisible features gradually corrupt the model.[3]
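
To make this mechanism concrete, the sketch below shows the general idea in PyTorch: optimize a small, tightly bounded pixel perturbation so that an image’s feature representation drifts toward a different “anchor” concept, while every pixel change stays within a visually imperceptible budget. This is not Nightshade’s published algorithm; the backbone (ResNet-50), the perturbation budget, and the function and file-path parameters are assumptions chosen purely for illustration.

```python
# Illustrative sketch only -- NOT Nightshade's actual algorithm.
# Idea: nudge an image's feature representation toward a different
# "anchor" concept while keeping every pixel change below a small,
# visually imperceptible bound (epsilon).
import torch
import torch.nn.functional as F
from torchvision import models, transforms
from PIL import Image

device = "cuda" if torch.cuda.is_available() else "cpu"

# Any pretrained vision backbone can stand in as a feature extractor.
extractor = models.resnet50(weights=models.ResNet50_Weights.DEFAULT)
extractor.fc = torch.nn.Identity()          # keep penultimate features
extractor.eval().to(device)

to_tensor = transforms.Compose([transforms.Resize((224, 224)), transforms.ToTensor()])
normalize = transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225])

def load(path: str) -> torch.Tensor:
    return to_tensor(Image.open(path).convert("RGB")).unsqueeze(0).to(device)

def poison(source_path: str, anchor_path: str,
           epsilon: float = 8 / 255, steps: int = 200, lr: float = 1e-2) -> torch.Tensor:
    """Return a copy of `source` whose features resemble `anchor`,
    with per-pixel changes capped at `epsilon`."""
    source, anchor = load(source_path), load(anchor_path)
    with torch.no_grad():
        target_feat = extractor(normalize(anchor))

    delta = torch.zeros_like(source, requires_grad=True)
    opt = torch.optim.Adam([delta], lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        feat = extractor(normalize((source + delta).clamp(0, 1)))
        F.mse_loss(feat, target_feat).backward()   # pull features toward the anchor
        opt.step()
        delta.data.clamp_(-epsilon, epsilon)       # stay within the invisible budget

    return (source + delta).clamp(0, 1).detach()
```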

This process of data poisoning involves contributing inaccurate or meaningless data so that the underlying AI model performs poorly.[4] Data poisoning attacks manipulate training data to introduce unexpected behaviors into machine learning models at the training stage.[5] AdNauseam, for instance, is a browser extension that clicks on every single ad sent your way, confusing Google’s ad-targeting algorithms. In the art and generative AI context, poisoned data samples can manipulate a model into learning to label a hat as a cake or cartoon art as Impressionism.
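
As a toy analogy for how mislabeled training data skews what a model learns (Nightshade targets text-to-image diffusion models, not a simple classifier), the self-contained snippet below flips a growing fraction of training labels and shows test accuracy dropping as the poisoned share grows. The dataset, model, and poisoning fractions are arbitrary choices for demonstration only.

```python
# Toy label-flip poisoning on a simple classifier -- an analogy only;
# Nightshade targets large text-to-image models, not this setup.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

rng = np.random.default_rng(0)
for poison_frac in (0.0, 0.05, 0.15, 0.30):
    y_poisoned = y_train.copy()
    n_poison = int(poison_frac * len(y_poisoned))
    flip = rng.choice(len(y_poisoned), size=n_poison, replace=False)
    y_poisoned[flip] = 1 - y_poisoned[flip]        # the "car labeled as cow" step

    clf = LogisticRegression(max_iter=1000).fit(X_train, y_poisoned)
    print(f"{poison_frac:.0%} of labels poisoned -> test accuracy {clf.score(X_test, y_test):.3f}")
```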

A chart from the Nightshade research team’s paper, displaying examples of images generated by the Nightshade-poisoned model compared to the original clean model.

Because AI companies train their models on vast datasets, poisoned data is very difficult to remove. Identifying poisoned images requires AI companies to painstakingly find and remove each corrupted sample. If the training set is large enough, removing all copyrighted or sensitive information from an AI model can require effectively retraining it from scratch, which can cost tens of millions of dollars.[6] Ironically, this is often the very same reason companies give for why biased or nonconsensual data cannot be removed.

When the Nightshade team fed 50 poisoned images, which labeled pictures of cars as cows, into Stable Diffusion, the model started generating distorted images of cars.[7] After 100 samples, the model began producing images that had more cow-like features than car-like ones. At 300 images, virtually no car-like features remained.

The research team that created Nightshade also developed Glaze, a tool that allows artists to “mask” their personal style to prevent it from being scraped by AI companies. The advent of text-to-image generative models has led companies and grifters to take artists’ work and train models to recreate their styles. Glaze works in a similar way to Nightshade, changing the pixels of images in subtle ways that are invisible to the human eye but that lead machine learning models to interpret the image as something else. Glaze received a “Special Mention” in TIME’s Best Inventions of 2023.

A chart from the Nightshade research team’s paper, comparing clean and poisoned images and demonstrating how related prompts are also corrupted by the poisoning via a bleed-through effect.

If Nightshade can effectively break text-to-image models, then the AI companies that develop these models may finally have to respect artists’ rights. For instance, the deterrent effect of data poisoning may motivate AI companies to seek permission from artists and to compensate them for continued use of their work.

Some developers of text-to-image generative models, like Stability AI and OpenAI, have offered to let artists opt out of having their images used to train future versions of the models. But these opt-out policies place the onus on artists to reclaim their work rather than on the AI companies systematically scraping images online.

Tools like Nightshade should give AI companies pause. With any luck, the risk of destroying an entire model will force companies to think twice before taking artists’ work without their consent.

About the Author

Patrick K. Lin is the Center for Art Law’s 2023-2024 Judith Bresler Fellow and author of Machine See, Machine Do, a book about how public institutions use technology to surveil, police, and make decisions about the public, as well as the historical biases that impact that technology. Patrick is interested in legal issues that exist at the intersection of art and technology, particularly involving artificial intelligence, data privacy, and copyright law.

Suggested Readings

  1. Andy Baio, Exploring 12 Million of the 2.3 Billion Images Used to Train Stable Diffusion’s Image Generator, Waxy (Aug. 30, 2022), https://waxy.org/2022/08/exploring-12-million-of-the-images-used-to-train-stable-diffusions-image-generator/.
  2. Shawn Shan, Wenxin Ding, Josephine Passananti, Haitao Zheng & Ben Y. Zhao, Prompt-Specific Poisoning Attacks on Text-to-Image Generative Models, arXiv (Oct. 20, 2023), https://arxiv.org/abs/2310.13828.
  3. Melissa Heikkilä, This New Data Poisoning Tool Lets Artists Fight Back Against Generative AI, MIT Technology Review (Oct. 23, 2023), https://www.technologyreview.com/2023/10/23/1082189/data-poisoning-artists-fight-generative-ai/.
  4. James Thorpe, What Is Data Poisoning & Why Should You Be Concerned?, International Security Journal (Sept. 13, 2021), https://internationalsecurityjournal.com/data-poisoning/.
  5. Id.
  6. Lauren Leffer, Your Personal Information Is Probably Being Used to Train Generative AI Models, Scientific American (Oct. 19, 2023), https://www.scientificamerican.com/article/your-personal-information-is-probably-being-used-to-train-generative-ai-models/.
  7. Teresa Nowakowski, Artists Can Use This Tool to Protect Their Work From A.I. Scraping, The Smithsonian (Nov. 3, 2023), https://www.smithsonianmag.com/smart-news/this-tool-uses-poison-to-help-artists-protect-their-work-from-ai-scraping-180983183/.

Disclaimer: This article is for educational purposes only and is not meant to provide legal advice. Readers should not construe or rely on any comment or statement in this article as legal advice. For legal advice, readers should seek a consultation with an attorney.
