Courts are increasingly confronting difficult questions about the authenticity and admissibility of video evidence because of the rise of “deepfakes”—videos or other digital media created using artificial intelligence to make it appear that real people said or did something they never actually said or did.
In a hotly disputed 4-3 decision, the New York Court of Appeals held that video recordings from a home camera appearing to depict the sexual abuse of a child were inadmissible because the Erie County Department of Social Services failed to sufficiently authenticate them.
The vigorous dissents warn that the majority’s reasoning could impose significant hurdles to the admission of video evidence, particularly in cases involving child abuse. Trial lawyers practicing in New York courts, particularly defense lawyers, should take note: the decision likely affords them several significant tools for opposing the admission of unfavorable video evidence.
Background and Procedural History
In December 2022, the New York Family Court found that M.H. (“mother”) had abused her two children. The key evidence offered by the Erie County Department of Social Services (“County”) consisted of several video recordings that appeared to show the mother’s former live-in boyfriend (“boyfriend”) sexually abusing the mother’s daughter in their home.
The recordings were obtained by the FBI in early 2022 after an unrelated suspect in a child-pornography investigation told an FBI agent that he had been “hack[ing] into security web cameras for the past few years.” Matter of M.S., No. 7, 2026 WL 436359 (N.Y. Feb. 17, 2026).
One of these cameras, which the suspected child pornographer said he had hacked in 2019, “showed what he thought was an adult male sexually abusing the man’s 15-year-old stepdaughter.” He told the agent that he “watched a lot of the security camera footage of this house” and “had saved some videos from that camera in a particular location on his computer, along with a screenshot that contained details about the security camera login information, including a possible name, email and IP address.”
The FBI agent found the recordings on the suspect’s computer and, “[b]ased on information from the screenshot on the suspect’s computer and other investigative work,” was able to identify the boyfriend. He notified local law enforcement, who conducted a search of the home, during which “police officers took several pictures of the living room which appeared to match details shown in the videos.”
Based largely on the recordings from the hacked security camera, the Family Court concluded that the mother had failed to protect her daughter from abuse. The Appellate Division, Fourth Department, affirmed the Family Court’s order, finding that the recordings had been correctly admitted, and that any uncertainties about the authenticity or reliability of the recordings went to the weight of that evidence rather than its admissibility. Matter of M.S., 229 A.D.3d 1040, 1042 (N.Y. App. Div. 2024), rev’d, 2026 WL 436359 (N.Y. Feb. 17, 2026).
The Court of Appeals Reverses
The New York Court of Appeals, in a 4-3 opinion, held that the recordings were inadmissible. Matter of M.S., 2026 WL 436359 (N.Y. Feb. 17, 2026).
The majority opinion relied on the standard for authentication of video evidence established by People v. Patterson, a 1999 case. Under that standard, video evidence can be authenticated through (1) “testimony of a witness to the recorded events or of an operator or installer or maintainer of the equipment that the videotape accurately represents the subject matter depicted” or (2) expert or lay testimony establishing that the video “truly and accurately represents what was before the camera.” 710 N.E.2d 665, 668 (N.Y. 1999). The recordings, the majority held, failed to meet that standard.
First, according to the majority, neither the boyfriend, who had set up the cameras, nor the suspected child pornographer testified as to the authenticity of the recordings, and the FBI agent could provide only hearsay testimony of what he had learned from the child pornographer about the provenance of the recordings.
Second, “there was a roughly two-and-a-half-year period between when the videos were stolen by the child pornographer and when they were recovered by the FBI.”
Third, the recordings recovered by the FBI were “only snippets excerpted from a longer, unrecovered feed,” and the child pornographer had not “explain[ed] what or how he selected and what he omitted or deleted,” nor had he told the FBI agent that the recordings “were unaltered.”
The possibility of alteration was the key motivation behind the majority’s ruling. The FBI agent testified that he had made no “observations that led [him] to believe that the video footage had been tampered with or altered in any way,” but the majority noted that the FBI agent “did not testify as to any training or experience in identifying signs of tampering or video alteration that could establish his ability to authenticate as a lay witness. He was not asked whether he examined the videos to look for tampering; whether he used any forensic tool to detect tampering; or even whether he was able to offer a learned affirmative opinion that the video was not tampered with.”
The majority further observed that the FBI agent “was concededly familiar with child pornography through his experience, but the County never sought to qualify him as an expert (or establish his experience) in video authentication,” adding that “[b]road familiarity with a subject does not make someone learned in detecting evidence of tampering or fabrication.”
Concern Over Deepfakes
By characterizing Patterson as being “on all fours with” Matter of M.S., the majority positions its analysis as a straightforward application of a nearly three-decade-old standard for the admissibility of video evidence.
But as the two dissenting opinions strenuously argued, the majority gave short shrift to the compelling circumstantial evidence of authenticity actually present in the record. This dismissiveness suggests that the majority’s real concern was the heightened risk of deepfakes, and that to address that risk it sidestepped a faithful application of Patterson in favor of a heightened admissibility standard.
The Circumstantial Evidence for Authenticity Was More Compelling Than the Majority Let On. The majority opinion denied “that circumstantial evidence is never relevant to authentication,” instead concluding merely that “the circumstantial evidence offered here is insufficient.” As the dissenting judges articulated, however, the circumstantial evidence that the majority omitted or glossed over was compelling.
- The majority noted that law enforcement had “observed cameras in the house” without mentioning that, far from merely being present, the cameras were in the right “position to capture the sexual abuse as depicted in the videos.”
- The majority conceded that “some identifying features” of the home visible in the recordings were corroborated by photos taken of the home by law enforcement, when in fact the exact “same couch, painting, afghan, end table and lamp” were found in the home.
- The majority acknowledged that law enforcement found “sex toys of the kind depicted in the videos” during their search of the home, but failed to mention that these objects were taken from “the boyfriend’s fingerprint-locked bedroom (where there were no cameras),” making it impossible for a fabricator to have inserted these objects into the recordings unless the child pornographer already knew they existed by “somehow also gather[ing] innocuous footage of specific sex toys owned by the boyfriend” at some other time.
- The majority omitted the fact that, in one of the recordings, after the daughter rubbed her arm, the boyfriend asked her “if that was where the daughter ‘got her shot’—during a video that, [the County] later proved, was date-stamped shortly after the daughter, in actual fact, received two vaccinations at a doctor’s office”—also an impossible fabrication without other footage to alert the fabricator that the daughter had received the vaccinations that day.
The Majority’s True Goal Was To Heighten the Authentication Standard In Light of the Risk of Deepfakes. The majority framed its ruling as simply an application of Patterson, and not a new, tougher standard necessitated by “the far greater ability to manipulate video images now as compared to 1999, when [it] decided Patterson.” But the majority’s concern about deepfakes suffused its entire analysis.
For example, in Patterson, the fact that a video was an “accurate depiction” of a location’s “actual physical layout” was insufficient for authentication, and the Matter of M.S. majority concluded that the fact that the recordings in M.S. “apparently accurately depicted the home” was “not meaningfully different.” As framed by the majority, that application of Patterson required no elaboration. But the majority took care to add the observation that if “such testimony was insufficient then, the increasing prevalence of ‘deepfake’ videos has only rendered the method of matching circumstantial details in a video to personal observations a more suspect form of authentication.”
Similarly, set against the “remote” “possibility that the shopkeeper in Patterson could have altered the videotape in question in two weeks,” the majority found no difficulty in concluding that the “long gap in the chain of custody of a key piece of evidence—65 times as long as the gap in Patterson—raises further doubts about the authenticity of the videos.” But, without apparent need if the extended gap alone was a problem for authentication, the majority took care to note the child pornographer’s “ability to hack into computers and cameras of far-away people, with whom he had no connection, suggests he possessed the technical savvy to manipulate video images.”
Next, although the majority described “whether phony videos could be created by 2022” as an “irrelevant and abstract question,” it nevertheless dueled with one of the dissenting opinions about that possibility and concluded that “the technology to make realistic manipulated videos was widely available then.”
Finally, and perhaps most revealingly, the mother had not argued to the Family Court that the recordings were deepfakes or “that the portions of the videos showing sexual abuse were fabricated.” For an appellate court that ordinarily refuses to consider arguments not raised in the trial court, the choice to countenance a new argument indicates how eager the majority was to address the deepfakes issue.
Potential Ramifications of the Majority’s Ruling
The dissenting opinions highlight several possible ramifications of the majority’s decision.
The role of circumstantial evidence for authentication. The majority contends rather tepidly that its holding does not mean that “circumstantial evidence is never relevant to authentication.” But its failure to explain “what other forms of circumstantial evidence could possibly suffice” and its deeply jaundiced eye toward the circumstantial evidence offered that the recordings were authentic give some support to a dissenting judge’s belief that the decision “effectively h[eld] that circumstantial evidence can never be enough.”
Presumption of alteration and required expert testimony. Because the mother offered no evidence the videos were altered or fabricated, the majority’s holding may, in practice, establish a presumption that all video evidence is inadmissible unless conclusively demonstrated not to have been altered or fabricated wholesale. That authentication standard may be prohibitively difficult to clear in practice. The majority faulted the FBI agent’s testimony because he “did not testify as to any training or experience in identifying signs of tampering or video alteration,” and “was not asked whether he examined the videos to look for tampering; whether he used any forensic tool to detect tampering; or even whether he was able to offer a learned affirmative opinion that the video was not tampered with.” Although the majority does not hold that expert evidence is necessary in this context, it is difficult to imagine how a lay witness could offer the testimony apparently demanded by this opinion.
Conclusion
Only time will tell how this decision affects the authentication analysis of video evidence offered in New York courts. But trial lawyers faced with potentially negative video evidence should make use of all the tools potentially afforded by this decision by emphasizing the possibility of a deepfake video, challenging the chain of custody of that video, and demanding expert authentication.
The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.