Center for Art Law

Can AI Tell the Truth, the Whole Truth, and Nothing But the Truth? The Courts Aren’t Sure

November 14, 2025


By Rebecca Bennett

As artificial intelligence (AI) becomes an increasingly ubiquitous presence across virtually every industry, legal systems are forced to grapple with the implications of this technology seeping into courtroom proceedings. The legal system plays a significant role in setting standards for ethical conduct surrounding evolving technologies like AI. And while numerous cases have already addressed complaints related to the technology, such as courts fining attorneys for filing AI-hallucinated content and the ongoing legal battle between Thomson Reuters and Ross Intelligence, where AI is the crux of the dispute, AI-generated evidence is becoming increasingly common in disputes seemingly unrelated to the technology itself.[1][2] For instance, ChatGPT was reportedly used to identify the suspect accused of starting the Pacific Palisades fire in Los Angeles in January 2025.[3] Yet, as AI evidence enters the courtroom, many professionals are concerned that existing legal frameworks are not prepared to handle the significant challenges posed by the technology’s ability to generate highly realistic falsified content.[4]

These issues are particularly relevant to the visual arts for several reasons. First, artists are already involved in significant copyright lawsuits against AI companies. For example, Sarah Andersen, Kelly McKernan, and Karla Ortiz sued Stability AI, Midjourney, and DeviantArt in 2023 over the use of their works to train AI models.[5] But beyond lawsuits directly probing the boundaries of permissible use of human-generated content in AI model training, AI systems are increasingly recognized for their potential to support authentication and heritage conservation efforts.[6][7] Art authentication stands to benefit from the integration of AI methods, given that the field currently relies on placing high levels of trust in highly specialized human experts. The subjective nature of these analyses means that two experts may reach different conclusions about the authenticity of a work, and that highly skilled forgers can succeed in deceiving multiple experts. Researchers, however, have developed AI tools that, when extensively trained, can reliably distinguish between authentic and forged works.[8] As a result, AI-generated evidence may be increasingly called upon to supplement or corroborate the reports of human authentication experts in legal disputes.
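To illustrate the underlying idea of such tools (a classifier trained on labeled examples of authentic and forged works), the sketch below builds a toy nearest-centroid model over invented "brushstroke statistic" feature vectors. Real systems, such as the tile-based Pollock study cited above, train convolutional networks on image data; every name and number here is synthetic.

```python
# Purely illustrative sketch: a toy nearest-centroid classifier over
# hypothetical "brushstroke statistic" feature vectors. All data below
# is invented; real authentication systems use far richer image features.
import numpy as np

def train_centroids(features, labels):
    """Compute one mean feature vector (centroid) per class label."""
    return {lab: features[labels == lab].mean(axis=0)
            for lab in np.unique(labels)}

def classify(centroids, x):
    """Assign x to the class whose centroid is nearest (Euclidean)."""
    return min(centroids, key=lambda lab: np.linalg.norm(x - centroids[lab]))

# Synthetic training data: 2-D feature vectors per painting sample.
rng = np.random.default_rng(0)
authentic = rng.normal(loc=[1.0, 1.0], scale=0.1, size=(50, 2))
forged = rng.normal(loc=[-1.0, -1.0], scale=0.1, size=(50, 2))
X = np.vstack([authentic, forged])
y = np.array(["authentic"] * 50 + ["forged"] * 50)

model = train_centroids(X, y)
print(classify(model, np.array([0.9, 1.1])))  # near the "authentic" cluster
```

Even in this toy form, the sketch shows why extensive, well-labeled training data matters: the classifier's reliability comes entirely from how representative the labeled examples are.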

The art market, traditionally wary of new methods, remains reluctant to displace the connoisseurship of human professionals with technological alternatives.[9] Similar concerns are prevalent in the legal field. The United States judiciary is currently adapting to AI’s entrance into the courtroom. As AI’s capabilities and potential applications rapidly evolve, ethical debates have pushed courts to solidify verification procedures and guidelines for judges and juries.

Ethical Concerns

As an evidentiary tool, AI raises a multitude of ethical quandaries. To handle the inevitable influx of AI-generated evidence, courts must prepare to balance the potential benefits of the emerging technology against its risks. This is especially pressing in the context of jury trials, given the potential for generative AI products to produce extraordinarily realistic false information.[10] Fears of deepfakes are not unfounded: a 2021 study by researchers at the University of Amsterdam demonstrated that people cannot reliably identify falsified content.[11] Such incidents are common, as evidenced by television host Chris Cuomo’s recent outrage over a falsified video of US Representative Alexandria Ocasio-Cortez.[12] Although the video displayed a watermark indicating that AI was used to create it, Cuomo took to the internet to criticize Ocasio-Cortez for opinions her falsified image expressed in the video.[13]

Unfortunately, AI technology designed to detect AI-generated content remains unreliable, creating a difficult paradox for legal professionals.[14] Professor Maura R. Grossman, a leading researcher investigating the integration of AI into the legal system, argues it is paramount that courts respond proactively to these issues because audiovisual evidence is far more memorable than, for example, verbal or written testimony.[15] On the one hand, it is concerning that audiovisual evidence is likely to be perceived as reliable without further insight into how it was gathered; on the other, an overly cautious approach could cause jurors to become too distrustful of the legal process.

Trust in the authority of evidence is critical due to the phenomenon of defensive processing: once people accept that something is fake, it is nearly impossible to recalibrate their perceptions.[16] In a 2019 article published in the California Law Review, professors Danielle Citron and Bobby Chesney introduced the now frequently cited “liar’s dividend,” a concept capturing the danger that rising distrust will encourage claims of fakery to be unduly leveled at legitimate evidence.[17] Courts must therefore carefully consider how they discuss the validity of AI-generated evidence, as maintaining a high level of trust in the courtroom is necessary to protect the ethical functioning of the legal process.

To combat these challenges, Grossman advocates an approach that encourages critical analysis without making jurors overly skeptical of the evidence presented to them.[18] She distinguishes between evidence that all parties acknowledge incorporates AI and unacknowledged evidence whose manipulation the parties dispute.[19] In her view, acknowledged evidence simply requires confirmation of its validity and reliability, while the content of unacknowledged evidence must be proven genuine.[20]

In a webinar co-hosted by the National Center for State Courts and the Thomson Reuters Institute on August 20, 2025, legal professionals outlined a series of measures courts could adopt as standard when faced with AI-generated evidence.[21] Ideally, they argue, any AI-generated evidence should be clearly acknowledged as such and accompanied by expert witness testimony explaining the process that led to the model’s findings.[22] These practices should be integrated throughout trial proceedings, from jury selection and instructions to the trial itself. During selection, courts could screen for technological literacy and bias, while jury instructions should communicate unambiguous, plain-language explanations and guidelines on authenticity.[23] While these suggestions are certainly prudent, it is also important to consider the existing legal frameworks designed to handle evidence verification.

Updates to the Federal Rules of Evidence

In response to the concerns outlined above, the federal courts’ Advisory Committee on Evidence Rules has acknowledged the need to update the Federal Rules of Evidence with provisions specifically governing AI. Beginning in 2023, the committee debated amendments to Rule 901, which governs evidence authentication.[24] Rule 901 sets a low threshold for authenticity, generally assuming that evidence is derived from reliable sources.[25] Numerous proposals were considered, yet in May 2025 the committee ultimately chose not to adopt any amendments to Rule 901.[26] The committee reasoned that acting on authenticity concerns may not be immediately necessary, given that the existing rules have proven capable of handling authenticity concerns regarding social media posts.[27] During the same session, however, the committee also considered a proposal to adopt a new Rule 707, aimed at issues stemming from AI evidence admitted without expert testimony.[28] Rule 707 was preliminarily accepted by the committee and released for public comment in August.[29] The rule states that when “machine-generated evidence is offered without an expert witness and would be subject to Rule 702 if testified to by a witness, the court may admit the evidence only if it satisfies the requirements of Rule 702 (a)-(d).”[30] An exception specifies that Rule 707 does not apply “to the output of simple scientific instruments.”[31]

If enacted, the rule would subject machine-generated evidence to the same admissibility standards applied to expert testimony under Rule 702.[32] Under this framework, AI evidence would be held to the same standards of validity and reliability as a human expert, ideally increasing transparency about how AI outputs are generated. This addresses many concerns raised by legal scholars by requiring litigants to clearly convey the methodology used to generate the evidence and its relevance to the case at hand. The proposed rule is open to public comment until February 2026.[33]
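The gating structure of the draft rule, as described above, can be sketched as a small decision function. This is an illustration of the rule's logic only, not legal advice or an authoritative reading of the draft text, and all parameter names are invented for clarity.

```python
# Toy formalization of proposed Rule 707's structure as described in the
# accompanying text. Purely illustrative; parameter names are invented.
def rule_707_poses_no_bar(machine_generated: bool,
                          offered_with_expert: bool,
                          would_be_702_if_testified: bool,
                          simple_scientific_instrument: bool,
                          satisfies_702_a_through_d: bool) -> bool:
    """Return True when Rule 707 itself would not block admission.

    Other Federal Rules of Evidence still apply; this models only this
    one rule's trigger conditions and its carve-out.
    """
    if not machine_generated:
        return True   # Rule 707 addresses machine-generated evidence only
    if simple_scientific_instrument:
        return True   # express exception for simple scientific instruments
    if offered_with_expert:
        return True   # expert testimony takes the evidence outside Rule 707
    if not would_be_702_if_testified:
        return True   # triggers only where Rule 702 would otherwise apply
    # Trigger met: admissible only if Rule 702(a)-(d) standards are satisfied
    return satisfies_702_a_through_d

# Machine output offered without an expert, failing the 702 standards:
print(rule_707_poses_no_bar(True, False, True, False, False))  # False
```

The sketch makes the rule's conditional shape explicit: the Rule 702 reliability inquiry is reached only when machine-generated evidence arrives without an expert and outside the simple-instrument exception.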

Conclusion

Whether this rule will ultimately be enacted remains to be seen. And while amendments to the Federal Rules of Evidence are an encouraging step, they should not be seen as the end of the discussion. The potency and novelty of AI technologies require ongoing debate and the adoption of flexible legal frameworks. Rigid regulations could easily become obsolete as the applications and capabilities of AI continue to expand, necessitating flexibility and creativity from legal professionals. Rather than viewing these developments with pessimism, such an attitude acknowledges AI’s potential benefits while remaining cognizant of its consequences. Instituting safeguards against deepfakes and ensuring AI models are made comprehensible to all parties should bolster confidence in the legal process rather than detract from equity and transparency.

Art authentication is, as noted earlier, an area where combining AI analyses with human expert opinions could increase confidence in findings. Both fields are engaged in a search for truth. A clear parallel can be drawn between the skepticism common in discussions of AI-generated content and the art market’s attitude toward authentication. In both cases, trust in the intrinsic value of the object under scrutiny is paramount. A forgery, even a great one, is of lesser value because genuine authorship and creativity matter in artistic production. Similarly, courts dealing with deepfaked evidence are understandably reluctant to allow fully computer-generated materials to contribute to trial outcomes. Yet whether the courts are ready or not, AI is permeating every aspect of society, and complacency and inaction are far more dangerous than taking measured, thoughtful steps toward managing its consequences.

Further Resources:

  1. George Washington University, AI Litigation Database
  2. Bruce Barcott, AI Lawsuits Worth Watching, TechPolicy.Press (July 1, 2024).
  3. Bobby Chesney & Danielle Citron, Deep Fakes: A Looming Challenge for Privacy, 107 Cal. L. Rev. 1753 (2019).

About the author:

Rebecca Bennett is a recent graduate of McGill University with a BA in Art History and International Development. Currently a graduate intern with the Center, she is pursuing a career in art law.

Select References:

  1. Thomson Reuters Enterprise Centre GmbH v. Ross Intelligence Inc., No. 1:20-CV-613-SB (D. Del. Feb. 11, 2025). ↑
  2. Jaclyn Diaz, A Recent High-Profile Case of AI Hallucination Serves as a Stark Warning, NPR NEWS (July 10, 2025). ↑
  3. Ana Faguy and Nardine Saad, ChatGPT Image Snares Suspect in Deadly Pacific Palisades Fire, BBC NEWS (October 8, 2025). ↑
  4. Natalie Runyon, AI Evidence in Jury Trials: Navigating the New Frontier of Justice THOMSON REUTERS (October 6, 2025). ↑
  5. Andersen v. Stability AI Ltd., No. 23-cv-00201-WHO (LJC), 2025 U.S. Dist. LEXIS 50848 (N.D. Cal. Mar. 19, 2025). ↑
  6. Shelby Jorgensen, How to Catch a Criminal in the 21st Century and Why AI Might be Able to Help, the Center for Art Law (August 3, 2025). ↑
  7. J.H. Smith, C. Holt, N.H. Smith & R.P. Taylor, Using Machine Learning to Distinguish Between Authentic and Imitation Jackson Pollock Poured Paintings: A Tile-Driven Approach to Computer Vision, 19 PLOS ONE e0302962 (2024). ↑
  8. Sandro Boccuzzo, Deborah Desirée Meyer & Ludovica Schaerf, Art Forgery Detection Using Kolmogorov Arnold and Convolutional Neural Networks, in European Conference on Computer Vision 187 (Springer Nature Switzerland 2024). ↑
  9. George Nelson, AI is Trying to Take Over Art Authentication, But Longtime Experts Are Skeptical, ARTNews (August 30, 2024). ↑
  10. Abhishek Dalal et al., Deepfakes in Court: How Judges Can Proactively Manage Alleged AI-Generated Material in National Security Cases, University of Chicago Legal Forum (2024). ↑
  11. N.C. Köbis, B. Doležalová & I. Soraperra, Fooled Twice: People Cannot Detect Deepfakes but Think They Can, 24 iScience 103364 (2021). ↑
  12. Michael Sainato, Chris Cuomo mocked for response after falling for deepfake AOC video, The Guardian (August 7, 2025). ↑
  13. Id. Chris Cuomo mocked for response after falling for deepfake AOC video. ↑
  14. Stuart A. Thompson and Tiffany Hsu, How Easy Is It to Fool A.I.-Detection Tools?, The New York Times (June 28, 2023). ↑
  15. Id. Deepfakes in Court: How Judges Can Proactively Manage Alleged AI-Generated Material in National Security Cases. ↑
  16. Thomson Reuters Institute/National Center for State Courts, AI Evidence in Jury Trials: Authenticity, Admissibility, and the Role of the Court and Juries, Vimeo (August 20, 2025). ↑
  17. Bobby Chesney & Danielle Citron, Deep Fakes: A Looming Challenge for Privacy, 107 Cal. L. Rev. 1753 (2019). ↑
  18. University of Waterloo, Generative AI and the Legal System (April 16, 2024). ↑
  19. Id. AI Evidence in Jury Trials: Navigating the New Frontier of Justice. ↑
  20. Thomson Reuters Institute/National Center for State Courts, AI Evidence in Jury Trials: Authenticity, Admissibility, and the Role of the Court and Juries, Vimeo (August 20, 2025). ↑
  21. Id. AI Evidence in Jury Trials: Authenticity, Admissibility, and the Role of the Court and Juries. ↑
  22. Id. AI Evidence in Jury Trials: Authenticity, Admissibility, and the Role of the Court and Juries. ↑
  23. Id. AI Evidence in Jury Trials: Authenticity, Admissibility, and the Role of the Court and Juries. ↑
  24. Fed. R. Evid. 901; Riana Pfefferkorn, The Ongoing Fight to Keep Evidence Intact in the Face of AI Deception, TechPolicy.Press (August 14, 2025). ↑
  25. Id. Deepfakes in Court: How Judges Can Proactively Manage Alleged AI-Generated Material in National Security Cases. ↑
  26. US Courts, Advisory Committee on Evidence Rules-May 2025, Agenda Book (May 2, 2025). ↑
  27. Avi Gesser, Matt Kelly, Gabriel A. Kohan, and Jim Pastore, Federal Judicial Conference to Revise Rules of Evidence to Address AI Risks, Debevoise and Plimpton (March 20, 2025). ↑
  28. US Courts, Preliminary Draft of Proposed Amendments to the Federal Rules of Evidence (August 13, 2025). ↑
  29. US Courts, Proposed Amendments Published for Public Comment (August 15, 2025). ↑
  30. Id. Preliminary Draft of Proposed Amendments to the Federal Rules of Evidence. ↑
  31. Id. Preliminary Draft of Proposed Amendments to the Federal Rules of Evidence. ↑
  32. Fed. R. Evid. 702. ↑
  33. US Courts, Proposed Amendments Published for Public Comment (August 15, 2025). ↑

 

Disclaimer: This article is for educational purposes only and is not meant to provide legal advice. Readers should not construe or rely on any comment or statement in this article as legal advice. For legal advice, readers should seek a consultation with an attorney.
