ARTICLE
30 January 2026

AI In The Courtroom: How Proposed Rule 707 Could Shape Evidence Standards

Steptoe LLP
Consider a court case in which one side presents a chart, report, or video reconstruction created entirely by artificial intelligence (AI). For example, imagine a fraud trial in which the defense seeks to introduce an AI-produced forensic accounting report, trained on millions of financial records, to show that the government's theory does not align with the actual data. Whether and under what conditions such evidence should be admitted is an evidentiary question that the Federal Rules of Evidence do not currently address. The federal judiciary is now asking how courts should evaluate these increasingly common forms of AI-generated analysis.

Until February 16, 2026, the Advisory Committee on Evidence Rules is accepting public comments on several proposed changes to the Federal Rules of Evidence. Among these proposals is Rule 707, which would establish how courts should evaluate and regulate evidence produced by AI.

The proposal responds to concerns that AI-generated evidence might bypass the reliability standards required under Federal Rule of Evidence 702, which governs expert testimony. If adopted, Rule 707 would require that AI-produced evidence meet the same standards as expert testimony. In practice, this means that anyone seeking to introduce such evidence would need to show that it is the product of reliable principles and methods, and that those principles and methods were applied reliably to the facts of the case.

Under the proposed text, Rule 707 makes clear that when evidence produced by a machine or AI system is offered without an expert witness, it may be admitted only if it satisfies the reliability and methodology requirements of Rule 702(a)-(d). This includes demonstrating that the evidence is based on sufficient data, that it was produced through reliable principles and methods, and that those principles and methods were reliably applied to the facts of the case. The rule expressly excludes outputs from simple scientific instruments, focusing instead on more complex AI-generated evidence whose reliability is not self-evident.

If adopted, Rule 707 could become a valuable tool for white-collar defendants, particularly those seeking to use advanced analytics to rebut the government's narrative. A defendant who wants to introduce an AI-driven forensic accounting report would have a defined pathway to do so: demonstrating the system's methodology, data quality, and reliability in the same way an expert would lay a foundation under Rule 702. The rule thus brings clarity about what defendants must show for such evidence to reach a jury.

The proposed rule is scheduled for a vote by the Evidence Rules Committee on May 7, 2026. If the Committee approves it, the amendment would then move to the Judicial Conference for consideration, and, if endorsed there, would proceed through the required Supreme Court and congressional review under the Rules Enabling Act before it could take effect.

The draft Rule 707 can be found here.

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.

