Center for Art Law

Can This Data Poisoning Tool Help Artists Protect Their Work from AI Scraping?

November 21, 2023

Art by Patrick K. Lin, created using Canva.

By Patrick K. Lin

Generative AI tools like DALL·E, Midjourney, and Stable Diffusion are dominating the cultural zeitgeist but have not received a warm reception from artists. AI companies extract billions of images from the web, relying on artists’ pre-existing works to feed their models.[1]

The result is AI image generators producing images that contain artists’ visual artifacts and even artists’ signatures. For many artists, the speed and scalability of generative AI threaten to devalue the labor of creative expression or eliminate it outright. For instance, Marvel’s recent Disney+ show “Secret Invasion” featured an AI-generated opening credits sequence, sparking fears of artists being replaced by AI tools. Similarly, during the writers’ strike, tech companies showed off a fake, AI-generated TV episode.

AI companies are facing a wave of lawsuits alleging copyright infringement, such as the artists’ class action against Stability AI, Midjourney, and DeviantArt, as well as the Getty Images lawsuit against Stable Diffusion creator Stability AI. However, outside of taking expensive legal action, artists have very few defenses and tools at their disposal to deter companies and web scrapers from feeding their art to AI models without consent or compensation.

Enter Nightshade, a data poisoning tool developed by a team of researchers at the University of Chicago.[2] Nightshade changes an image’s pixels in ways that are imperceptible to the human eye. Machine learning models, however, do detect these subtle changes, which are carefully designed to disrupt a model’s ability to correctly label the images it trains on. If an AI model is trained on these “poisoned samples,” the invisible features of the images gradually corrupt the model.[3]
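As a rough illustration of the constraint described above, the sketch below adds small, bounded noise to an image array. This is not Nightshade’s actual algorithm (the real tool optimizes its perturbations against a model’s internal feature representations, while this noise is random and would poison nothing); the `perturb` helper and its `eps` budget are illustrative names, used only to show how a per-pixel change can stay small enough that a human viewer is unlikely to notice it.

```python
import numpy as np

def perturb(image: np.ndarray, eps: int = 2, seed: int = 0) -> np.ndarray:
    """Toy sketch: add noise bounded by +/- eps to an 8-bit image.

    Unlike Nightshade, the noise here is random rather than optimized
    against a model, so it demonstrates only the "imperceptibly small
    change" constraint, not an actual poisoning attack.
    """
    rng = np.random.default_rng(seed)
    noise = rng.integers(-eps, eps + 1, size=image.shape)
    return np.clip(image.astype(int) + noise, 0, 255).astype(np.uint8)

image = np.full((64, 64, 3), 128, dtype=np.uint8)   # a flat gray "image"
poisoned = perturb(image)
delta = np.abs(poisoned.astype(int) - image.astype(int))
print(delta.max() <= 2)   # True: every channel moved by at most 2/255
```

Real perturbation tools solve an optimization problem under a similar visibility budget, which is why the changes survive casual inspection while still steering what a model learns.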

This process of data poisoning involves contributing inaccurate or meaningless data, thus encouraging the underlying AI model to perform poorly.[4] Data poisoning attacks manipulate training data to introduce unexpected behaviors into machine learning models at the training stage.[5] AdNauseam, for instance, is a browser extension that clicks on every single ad sent your way, which confuses Google’s ad-targeting algorithms. In the art and generative AI context, poisoned data samples can manipulate models into learning to label a hat as cake or cartoon art as impressionism.
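To make the mislabeling idea concrete, here is a deliberately simple, assumption-laden sketch: a toy nearest-centroid classifier on 2-D points (nothing like the diffusion models Nightshade targets), where the poisoned samples carry the features of one class but the label of another, mirroring the hat-as-cake example. All names and numbers are invented for illustration.

```python
import numpy as np

def centroids(X, y):
    """Fit a toy nearest-centroid "model": one mean point per class."""
    return {c: X[y == c].mean(axis=0) for c in np.unique(y)}

def predict(model, x):
    """Classify x by whichever class centroid is closest."""
    return min(model, key=lambda c: np.linalg.norm(model[c] - x))

# Clean training set: "car" samples cluster near (0, 0), "cow" near (10, 10).
X_clean = np.array([[0., 0.], [1., 0.], [0., 1.],
                    [10., 10.], [11., 10.], [10., 11.]])
y_clean = np.array([0, 0, 0, 1, 1, 1])   # 0 = car, 1 = cow

# Poison: inject samples with cow-like features but the "car" label,
# analogous to images whose visible content no longer matches their label.
X_poisoned = np.vstack([X_clean, np.tile([10., 10.], (20, 1))])
y_poisoned = np.concatenate([y_clean, np.zeros(20, dtype=int)])

clean_model = centroids(X_clean, y_clean)
poisoned_model = centroids(X_poisoned, y_poisoned)

test_point = np.array([9.4, 9.4])            # clearly cow-like features
print(predict(clean_model, test_point))      # 1: the clean model is right
print(predict(poisoned_model, test_point))   # 0: the poisoned model is wrong
```

The twenty mislabeled points drag the “car” centroid toward cow territory, so a genuinely cow-like input gets misclassified — the same qualitative failure the Nightshade team reports at a much larger scale.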

A chart from the Nightshade research team’s paper, displaying examples of images generated by the Nightshade-poisoned model compared to the original clean model.

Because AI companies train their models on vast datasets, poisoned data is very difficult to remove. Identifying poisoned images requires AI companies to painstakingly find and remove each corrupted sample. If the training set is large enough, removing all copyrighted or sensitive information from an AI model can require effectively retraining the AI from scratch, which can cost tens of millions of dollars.[6] Ironically, this is often the very same reason companies give for why biased or nonconsensual data cannot be removed.

When the Nightshade team fed 50 poisoned images, which labeled pictures of cars as cows, into Stable Diffusion, the model started generating distorted images of cars.[7] After 100 samples, the model began producing images that had more cow-like features than car-like ones. At 300 images, virtually no car-like features remained.

The research team that created Nightshade also developed Glaze, a tool that allows artists to “mask” their personal style to prevent it from being scraped by AI companies. The advent of text-to-image generative models has led companies and grifters to take artists’ work and train models to recreate their style. Glaze works similarly to Nightshade, changing the pixels of images in subtle ways that are invisible to the human eye but that lead machine learning models to interpret the image as something else. Glaze received a “Special Mention” award in TIME’s Best Inventions of 2023.

A chart from the Nightshade research team’s paper, comparing clean and poisoned images and demonstrating how related prompts are also corrupted by the poisoning via a bleed through effect.

If Nightshade can effectively break text-to-image models, the AI companies that develop them may finally have to respect artists’ rights. For instance, the deterrent effect of data poisoning may motivate AI companies to seek permission from artists and compensate them for continued use of their work.

Some developers of text-to-image generative models, like Stability AI and OpenAI, have offered to let artists opt out of having their images used to train future versions of the models. These opt-out policies, however, place the onus on artists to reclaim their work rather than on the AI companies systematically scraping images online.

Tools like Nightshade should give AI companies pause. With any luck, the risk of destroying their entire model will force companies to think twice before taking artists’ work without their consent.

About the Author

Patrick K. Lin is the Center for Art Law’s 2023-2024 Judith Bresler Fellow and author of Machine See, Machine Do, a book about how public institutions use technology to surveil, police, and make decisions about the public, as well as the historical biases that impact that technology. Patrick is interested in legal issues that exist at the intersection of art and technology, particularly involving artificial intelligence, data privacy, and copyright law.

Suggested Readings

  1. Andy Baio, Exploring 12 Million of the 2.3 Billion Images Used to Train Stable Diffusion’s Image Generator, Waxy (Aug. 30, 2022), https://waxy.org/2022/08/exploring-12-million-of-the-images-used-to-train-stable-diffusions-image-generator/.
  2. Shawn Shan, Wenxin Ding, Josephine Passananti, Haitao Zheng & Ben Y. Zhao, Prompt-Specific Poisoning Attacks on Text-to-Image Generative Models, arXiv (Oct. 20, 2023), https://arxiv.org/abs/2310.13828.
  3. Melissa Heikkilä, This New Data Poisoning Tool Lets Artists Fight Back Against Generative AI, MIT Technology Review (Oct. 23, 2023), https://www.technologyreview.com/2023/10/23/1082189/data-poisoning-artists-fight-generative-ai/.
  4. James Thorpe, What is Data Poisoning & Why Should You Be Concerned?, International Security Journal (Sept. 13, 2021), https://internationalsecurityjournal.com/data-poisoning/.
  5. Id.
  6. Lauren Leffer, Your Personal Information Is Probably Being Used to Train Generative AI Models, Scientific American (Oct. 19, 2023), https://www.scientificamerican.com/article/your-personal-information-is-probably-being-used-to-train-generative-ai-models/.
  7. Teresa Nowakowski, Artists Can Use This Tool to Protect Their Work From A.I. Scraping, The Smithsonian (Nov. 3, 2023), https://www.smithsonianmag.com/smart-news/this-tool-uses-poison-to-help-artists-protect-their-work-from-ai-scraping-180983183/.

 

Disclaimer: This article is for educational purposes only and is not meant to provide legal advice. Readers should not construe or rely on any comment or statement in this article as legal advice. For legal advice, readers should seek a consultation with an attorney.
