Can AI Tell the Truth, the Whole Truth, and Nothing But the Truth? The Courts Aren’t Sure

November 14, 2025

By Rebecca Bennett

As artificial intelligence (AI) becomes an increasingly ubiquitous presence across virtually every industry, legal systems are forced to grapple with the implications of this technology seeping into courtroom proceedings. The legal system plays a significant role in setting standards for ethical conduct surrounding evolving technologies like AI. And while there have already been numerous cases addressing complaints related to the technology, such as courts fining attorneys for submitting AI-hallucinated content and the ongoing legal battle between Thomson Reuters and Ross Intelligence, in which AI is the crux of the dispute, AI-generated evidence is also becoming increasingly common in disputes seemingly unrelated to the technology itself.[1][2] For instance, ChatGPT was reportedly used to identify the suspect accused of starting the Pacific Palisades fire in Los Angeles in January 2025.[3] Yet, as AI evidence enters the courtroom, many professionals are concerned that existing legal frameworks are not prepared to handle the significant challenges posed by the technology’s ability to generate highly realistic falsified content.[4]

These issues are particularly relevant to the visual arts for a number of reasons. First, artists are already involved in significant copyright lawsuits against AI companies: Sarah Andersen, Kelly McKernan, and Karla Ortiz, for example, sued Stability AI, Midjourney, and DeviantArt in 2023 over the use of their works to train AI models.[5] Second, beyond lawsuits directly probing the boundaries of permissible and impermissible use of human-generated content in AI model training, AI systems are increasingly recognized for their potential to support authentication and heritage conservation efforts.[6][7] Art authentication stands to benefit from the integration of AI methods, given that the field currently relies on placing high levels of trust in highly specialized human experts. The subjective nature of these analyses means that two experts may reach different conclusions regarding the authenticity of a work, and that highly skilled forgers can succeed in deceiving multiple experts. Researchers have, however, developed AI tools that, when extensively trained, can reliably distinguish between authentic and forged works.[8] As a result, AI-generated evidence may be increasingly called upon to provide additional expertise or corroborate the reports of human authentication experts in legal disputes.

Traditionally suspicious, the art market remains wary of displacing the connoisseurship of human professionals in favor of technological alternatives.[9] Similar concerns are prevalent in the legal field, where the United States judiciary is currently adapting to AI’s entrance into the courtroom. As AI’s capabilities and potential applications rapidly evolve, ethical debates have pushed courts to solidify verification procedures and guidelines for judges and juries.

Ethical Concerns

As an evidentiary tool, AI raises a multitude of ethical quandaries. In order to handle the inevitable influx of AI-generated evidence, courts must prepare themselves to balance the potential benefits of the emerging technology against its risks. This is especially pressing in the context of jury trials, given the potential for generative AI products to produce extraordinarily realistic false information.[10] Fears of deepfakes are not unfounded: a 2021 study by researchers at the University of Amsterdam demonstrated that people cannot reliably identify falsified content.[11] Such failures are common, as evidenced by television host Chris Cuomo’s recent outrage over a falsified video of US Representative Alexandria Ocasio-Cortez.[12] Although the video displayed a watermark indicating AI was used to create it, Cuomo took to the internet to bash Ocasio-Cortez for the opinions her falsified image expressed in the video.[13]

Unfortunately, AI technology designed to detect AI-generated content remains unreliable, creating a difficult paradox for legal professionals.[14] Professor Maura P. Grossman, a leading researcher investigating the integration of AI into the legal system, argues that it is paramount for courts to respond proactively to these issues, because audiovisual evidence is far more memorable than, for example, verbal or written testimony.[15] On the one hand, it is concerning that audiovisual evidence is likely to be perceived as reliable without further insight into the methods used to gather it; on the other, an overly cautious approach could cause jurors to become too distrustful of the legal process.

Trust in the authority of evidence is critical due to the phenomenon of defensive processing: once people accept that something is fake, it is impossible to recalibrate their perceptions.[16] In a 2019 article published in the California Law Review, Professors Danielle Citron and Bobby Chesney introduced the now frequently cited “liar’s dividend,” a concept encompassing the danger that rising distrust will encourage claims of fakery to be unduly leveled at legitimate evidence.[17] Courts must therefore carefully consider how they discuss the validity of AI-generated evidence, as maintaining a high level of trust in the courtroom is necessary to protect the ethical functioning of the legal process.

To combat these challenges, Grossman advocates an approach that encourages critical analysis without causing jurors to become overly skeptical of the evidence presented to them.[18] She distinguishes between the challenges posed by evidence that all parties acknowledge incorporates AI and unacknowledged evidence whose manipulation the parties dispute.[19] In her view, acknowledged evidence simply requires confirmation of its validity and reliability, while the content of unacknowledged evidence must be proven genuine.[20]

In a webinar co-hosted by the National Center for State Courts and the Thomson Reuters Institute on August 20, 2025, legal professionals outlined a series of measures courts could adopt as standard practice when faced with AI-generated evidence.[21] Ideally, they argue, any generative AI evidence should be clearly acknowledged as such and accompanied by expert witness testimony speaking to the process that produced the model’s findings.[22] These practices should be integrated throughout trial proceedings, from jury selection and instructions to the trial itself. During selection, technological literacy and bias screenings could be conducted, while unambiguous, plain-language explanations and guidelines surrounding authenticity should be communicated during jury instructions.[23] While these suggestions are certainly prudent, it is also important to consider the existing legal frameworks designed to handle evidence verification.

Updates to the Federal Rules of Evidence

In response to the concerns outlined above, the federal courts’ advisory committee on evidence rules has acknowledged the need to update the Federal Rules of Evidence by adding specific provisions governing AI. Beginning in 2023, the committee debated amendments to Rule 901, which governs evidence authentication.[24] Rule 901 sets a low threshold for authenticity, generally assuming that evidence is derived from reliable sources.[25] Numerous proposals were considered, yet in May 2025 the committee ultimately chose not to adopt any amendments to Rule 901.[26] The committee reasoned that acting on authenticity concerns may not be immediately necessary, given that the rules have proven capable of handling authenticity concerns regarding social media posts.[27] However, during the same session, the committee also considered a proposal to adopt a new rule, Rule 707, aimed at addressing issues stemming from AI evidence that is admitted without expert testimony.[28] Rule 707 was preliminarily accepted by the committee and released for public comment in August 2025.[29] The rule states that when “machine-generated evidence is offered without an expert witness and would be subject to Rule 702 if testified to by a witness, the court may admit the evidence only if it satisfies the requirements of Rule 702 (a)-(d).”[30] An exception specifies that Rule 707 does not apply “to the output of simple scientific instruments.”[31]

If enacted, this rule would subject machine-generated evidence to the same admissibility standards applied to expert testimony under Rule 702.[32] Under this framework, AI evidence would be held to the same standards of validity and reliability as a human expert, ideally increasing transparency regarding the process by which AI outputs are generated. This addresses many concerns raised by legal scholars by requiring litigants to clearly convey the methodology used to generate the evidence and how it is relevant to the case at hand. The proposed rule is open for public comment until February 2026.[33]

Conclusion

Whether this rule will ultimately be adopted remains to be seen. And while amendments to the Federal Rules of Evidence are an encouraging step, they should not be seen as an end to the discussion. The potency and novelty of AI technologies require ongoing discussion and the adoption of flexible legal frameworks. Rigid regulations could easily become obsolete as the applications and capabilities of AI continue to expand, necessitating an attitude of flexibility and creativity from legal professionals. Rather than viewing these developments with pessimism, courts and practitioners can acknowledge AI’s potential benefits while remaining cognizant of its consequences. Instituting safeguards against deepfakes and ensuring that AI models are made comprehensible to all parties should bolster confidence in the legal process rather than detracting from equity and transparency.

Art authentication, as noted earlier, is an area where combining AI analyses with human expert opinions could increase confidence in findings. It is, fundamentally, a search for truth. A clear parallel can be drawn between the skepticism that commonly greets AI-generated content and the art market’s attitude towards authentication. In both cases, trust in the intrinsic value of the object under scrutiny is paramount. A forgery, even a great one, is of lesser value because genuine authorship and creativity matter in artistic production. Similarly, courts confronting deepfaked evidence are understandably skeptical of allowing fully computer-generated materials to contribute to trial outcomes. Yet the fact remains that, whether the courts are ready or not, AI is permeating every aspect of society, and an attitude of complacency and inaction is far more dangerous than taking measured, thoughtful steps towards managing its consequences.

Further Resources:

  1. George Washington University, AI Litigation Database
  2. Bruce Barcott, AI Lawsuits Worth Watching, TechPolicy.Press (July 1, 2024).
  3. Bobby Chesney & Danielle Citron, Deep Fakes: A Looming Challenge for Privacy, 107 Cal. L. Rev. 1753 (2019).

About the author:

Rebecca Bennett is a recent graduate of McGill University with a BA in Art History and International Development. Currently a graduate intern with the Center, she is pursuing a career in Art Law.

Select References:

  1. Thomson Reuters Enterprise Centre GmbH v. Ross Intelligence Inc., No. 1:20-cv-613-SB (D. Del. Feb. 11, 2025).
  2. Jaclyn Diaz, A Recent High-Profile Case of AI Hallucination Serves as a Stark Warning, NPR (July 10, 2025).
  3. Ana Faguy & Nardine Saad, ChatGPT Image Snares Suspect in Deadly Pacific Palisades Fire, BBC News (October 8, 2025).
  4. Natalie Runyon, AI Evidence in Jury Trials: Navigating the New Frontier of Justice, Thomson Reuters (October 6, 2025).
  5. Andersen v. Stability AI Ltd., No. 23-cv-00201-WHO (LJC), 2025 U.S. Dist. LEXIS 50848 (N.D. Cal. Mar. 19, 2025).
  6. Shelby Jorgensen, How to Catch a Criminal in the 21st Century and Why AI Might Be Able to Help, Center for Art Law (August 3, 2025).
  7. J.H. Smith, C. Holt, N.H. Smith & R.P. Taylor, Using Machine Learning to Distinguish Between Authentic and Imitation Jackson Pollock Poured Paintings: A Tile-Driven Approach to Computer Vision, 19 PLOS ONE e0302962 (2024).
  8. Sandro Boccuzzo, Deborah Desirée Meyer & Ludovica Schaerf, Art Forgery Detection Using Kolmogorov Arnold and Convolutional Neural Networks, in European Conference on Computer Vision 187 (Springer Nature Switzerland 2024).
  9. George Nelson, AI Is Trying to Take Over Art Authentication, But Longtime Experts Are Skeptical, ARTnews (August 30, 2024).
  10. Abhishek Dalal et al., Deepfakes in Court: How Judges Can Proactively Manage Alleged AI-Generated Material in National Security Cases, University of Chicago Legal Forum (2024).
  11. N.C. Köbis, B. Doležalová & I. Soraperra, Fooled Twice: People Cannot Detect Deepfakes but Think They Can, 24 iScience 103364 (2021).
  12. Michael Sainato, Chris Cuomo Mocked for Response After Falling for Deepfake AOC Video, The Guardian (August 7, 2025).
  13. Id.
  14. Stuart A. Thompson & Tiffany Hsu, How Easy Is It to Fool A.I.-Detection Tools?, The New York Times (June 28, 2023).
  15. Dalal et al., supra note 10.
  16. Thomson Reuters Institute & National Center for State Courts, AI Evidence in Jury Trials: Authenticity, Admissibility, and the Role of the Court and Juries, Vimeo (August 20, 2025), https://vimeo.com/showcase/11715086?video=1112900955.
  17. Bobby Chesney & Danielle Citron, Deep Fakes: A Looming Challenge for Privacy, 107 Cal. L. Rev. 1753 (2019).
  18. University of Waterloo, Generative AI and the Legal System (April 16, 2024).
  19. Runyon, supra note 4, https://www.thomsonreuters.com/en-us/posts/ai-in-courts/ai-evidence-trials/.
  20. Thomson Reuters Institute & National Center for State Courts, supra note 16.
  21. Id.
  22. Id.
  23. Id.
  24. Fed. R. Evid. 901; Riana Pfefferkorn, The Ongoing Fight to Keep Evidence Intact in the Face of AI Deception, TechPolicy.Press (August 14, 2025).
  25. Dalal et al., supra note 10.
  26. US Courts, Advisory Committee on Evidence Rules - May 2025, Agenda Book (May 2, 2025).
  27. Avi Gesser, Matt Kelly, Gabriel A. Kohan & Jim Pastore, Federal Judicial Conference to Revise Rules of Evidence to Address AI Risks, Debevoise & Plimpton (March 20, 2025).
  28. US Courts, Preliminary Draft of Proposed Amendments to the Federal Rules of Evidence (August 13, 2025).
  29. US Courts, Proposed Amendments Published for Public Comment (August 15, 2025).
  30. US Courts, supra note 28.
  31. Id.
  32. Fed. R. Evid. 702.
  33. US Courts, supra note 29.

 

Disclaimer: This article is for educational purposes only and is not meant to provide legal advice. Readers should not construe or rely on any comment or statement in this article as legal advice. For legal advice, readers should seek a consultation with an attorney.
