The rapid growth of deepfakes and other AI-generated media is no longer only a content moderation problem; it is increasingly a litigation and evidence problem for Indian courts. India's IT Rules 2026 establish traceability and labelling requirements for synthetic content, and litigants and judges must now examine how these requirements interact with the statutory rules on electronic evidence, especially Section 65B of the Bharatiya Sakshya Adhiniyam. This article analyses the emerging problems of deepfake evidence, the continuing importance of Section 65B certification, and the practical steps lawyers must take to preserve the integrity of digital evidence.
1. Deepfakes And Electronic Evidence - The Legal Landscape
Deepfakes, highly realistic but fabricated audio-visual content created using AI, pose a direct challenge to traditional assumptions about what video or audio “shows” in court. Academic commentary and enforcement-focused reports warn that the same synthetic techniques used for non-consensual intimate imagery, impersonation and fraud can also be deployed to manufacture or alter purported evidence. This undermines the longstanding evidentiary intuition that “seeing is believing” and raises new questions about authenticity, reliability and the standard for excluding manipulated material.
India’s statutory framework treats electronic records as a special category of evidence. Sections 65A and 65B of the Evidence Act, now carried into the Bharatiya Sakshya Adhiniyam, 2023, create a self-contained code for the admissibility of electronic records, with Section 65B requiring specified conditions and certification from the person responsible for the computer system. Courts have repeatedly emphasised that authenticity and integrity are crucial because electronic records are “quite prone to manipulation”, a concern that deepfakes magnify exponentially.
2. Section 65B And AI Generated Evidence - What Has Not Changed
Despite the technological shift, the core statutory test for admissibility of electronic evidence has not been rewritten specifically for deepfakes. Section 65B continues to require that:
- the electronic record was produced by a computer used regularly to store or process such information;
- the information was fed into the system in the ordinary course of activities;
- the computer was operating properly during the material period; and
- a certificate under Section 65B(4) is furnished by a person occupying a responsible position in relation to the operation of the device.
Courts have treated Sections 65A and 65B as a complete code that overrides the general rules on secondary evidence. Unless the original device itself is produced, a certificate under Section 65B(4) remains necessary, which makes certification unavoidable in most cases. Parties relying on digital evidence in deepfake disputes must therefore still satisfy the statutory requirements; the real battle lies in proving the authenticity of the material itself.
3. IT Rules 2026 - Traceability, SGI And Evidentiary Opportunities
The IT Rules 2026 amendments targeting synthetically generated information (SGI) add a new layer to the evidentiary conversation. Intermediaries enabling SGI are now expected to implement labelling, persistent metadata and other provenance mechanisms that can help distinguish AI generated content from authentic media. They are also required to maintain logs and cooperate with investigations, particularly where prohibited deepfakes are involved.
For litigants, this creates both opportunities and responsibilities. On one hand, embedded watermarks, unique identifiers and platform-side logs may make it easier to show that a particular file has (or has not) been altered between creation and presentation in court. On the other hand, failure to request or preserve such provenance data early in an investigation can irreversibly weaken the chain of custody and erode the weight courts are prepared to give to the evidence.
4. Authenticity, Chain Of Custody And Forensic Challenges
Scholarship on deepfake evidence identifies three interconnected challenges: establishing authenticity, maintaining a continuous chain of custody, and using forensic tools without overstating their capabilities.
- Authenticity - With AI tools able to fabricate or seamlessly modify faces and voices, courts cannot rely solely on visual inspection or witness familiarity to validate a recording. Experts increasingly combine cryptographic checksums and metadata analysis with forensic examination of compression artefacts and generative signatures to determine whether a file has been altered.
- Chain of custody - Commentators stress that every transfer must be documented, from the moment data leaves the original capture device, through messaging platforms and cloud storage, to forensic labs and, finally, court filings. Any break in that chain gives the defence room to argue that the original material was swapped for a deepfake or that the file was altered in transit.
- Forensic tools - While AI-based detectors can sometimes identify patterns of synthetic generation, reports warn that detection is an arms race and that false positives and negatives are possible. Reliance on a single tool, without a clear and documented methodology, can itself become a ground of challenge.
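As a minimal, illustrative sketch of the baseline record-keeping behind the checksum and metadata points above (not a substitute for specialist forensic tooling), the following Python snippet collects filesystem-level metadata and a SHA-256 checksum for a file before it is moved or copied. The function name and fields are hypothetical.

```python
import hashlib
from datetime import datetime, timezone
from pathlib import Path

def basic_file_metadata(path: Path) -> dict:
    """Collect filesystem-level metadata and a checksum as a first-pass record.

    Real forensic work goes much deeper (container metadata, codec and
    compression analysis, generative signatures); this only captures the
    baseline facts that should be noted before a file is handled further.
    """
    st = path.stat()
    return {
        "name": path.name,
        "size_bytes": st.st_size,
        "modified_utc": datetime.fromtimestamp(
            st.st_mtime, tz=timezone.utc
        ).isoformat(),
        "sha256": hashlib.sha256(path.read_bytes()).hexdigest(),
    }
```

Even this trivial record can later corroborate (or contradict) a witness's account of when and how a file was obtained.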
5. Practical Playbook For Litigators Handling Deepfake Evidence
Drawing on emerging practice notes and academic analysis, practitioners can adopt a structured approach when dealing with suspected deepfakes in Indian proceedings.
A. When relying on digital audio-visual evidence
- Secure the original device and file - Wherever possible, obtain and preserve the original recording device and the first generation file, rather than only compressed or forwarded copies.
- Generate and record hashes - Create cryptographic hash values (for example, SHA-256) at the earliest stage and record them in a contemporaneous memo for later verification.
- Obtain Section 65B certification - Ensure a detailed certificate identifying the device, the process by which the copy was made, and the person responsible is prepared in compliance with statutory requirements.
- Seek platform logs and provenance - Where the content passed through an intermediary, promptly seek preservation and disclosure of server logs, metadata and any SGI labels under the IT Rules 2026 framework.
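The hashing and memo steps above can be sketched in Python. This is an illustrative example only, assuming a JSON-lines memo format; the function names and file paths are hypothetical, and nothing here is a prescribed forensic procedure.

```python
import hashlib
import json
from datetime import datetime, timezone
from pathlib import Path

def sha256_of_file(path: Path) -> str:
    """Compute the SHA-256 digest of a file, reading it in chunks."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def record_hash(evidence: Path, memo: Path) -> dict:
    """Append a timestamped hash entry to a contemporaneous memo file.

    Each line of the memo is one JSON object, so the record is append-only
    and easy to inspect or re-verify later.
    """
    entry = {
        "file": evidence.name,
        "sha256": sha256_of_file(evidence),
        "recorded_at": datetime.now(timezone.utc).isoformat(),
    }
    with memo.open("a", encoding="utf-8") as f:
        f.write(json.dumps(entry) + "\n")
    return entry
```

Chunked reading keeps memory use constant even for large video files, and appending rather than overwriting preserves the order in which items were processed.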
B. When challenging deepfake or AI generated evidence
- Raise authenticity challenges early - Academic proposals suggest that a party alleging deepfake fabrication should first present preliminary evidence, such as metadata discrepancies, contextual inconsistencies or expert assessments, before seeking more advanced investigative procedures.
- Request independent forensic analysis - Ask the court to appoint or permit independent digital forensics experts, with instructions to examine both the file and any available platform-side logs or SGI markers.
- Question gaps in custody - Carefully map and test the chain of custody, probing unexplained time gaps, device changes or re-encoding events where manipulation could have occurred.
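As an illustrative sketch of the custody-gap check above, the snippet below re-hashes a produced file and compares it with the SHA-256 value recorded at collection. The function names are hypothetical, and a mismatch shows only that the file is no longer bit-identical to the one originally hashed, not that it is a deepfake.

```python
import hashlib
from pathlib import Path

def sha256_of_file(path: Path) -> str:
    """Compute the SHA-256 digest of a file, reading it in chunks."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_integrity(path: Path, recorded_sha256: str) -> bool:
    """Return True only if the file's current hash matches the recorded one.

    A mismatch does not prove fabrication; re-encoding by a messaging app
    or cloud service also changes the hash. It does, however, show the
    file differs from the one originally hashed and warrants forensic
    scrutiny of what happened in between.
    """
    return sha256_of_file(path) == recorded_sha256.lower()
```

This is exactly why first-generation files matter: a forwarded copy will almost never match the hash of the original, leaving the proponent unable to run this simple check.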
6. Interaction With Constitutional Rights And Fair Trial Guarantees
Analysis of AI-generated evidence in criminal trials shows that deepfakes raise constitutional as well as technical concerns. Potentially altered media threatens the presumption of innocence and the accused's right to a fair trial, especially where the defendant cannot afford expert witnesses to test the evidence. At the same time, victims of deepfake crimes such as sexual exploitation and financial fraud have a strong interest in ensuring that their authentic digital evidence is not too readily dismissed as fabricated.
Courts will need to develop guidelines for handling disputed AI-generated evidence that balance these competing interests: when to order expert testing, how to treat inconclusive results, and when digital evidence should meet a heightened verification standard before it can support a finding of guilt. The SGI traceability requirements and platform obligations under the IT Rules 2026 give courts additional tools, but they do not resolve the underlying due process questions.
7. Key Takeaways For Practitioners
Deepfakes show how easily traditional evidentiary assumptions can break down, and they sharpen the need for rigorous Section 65B compliance, sound chain-of-custody procedures and prompt digital forensic work. The IT Rules 2026 SGI framework does not replace these duties; instead, it creates new sources of provenance data and new expectations of intermediaries that practitioners must learn to leverage. For litigators and in-house counsel, the priority is to treat every potentially contentious digital recording as a high-risk asset that demands careful preservation, documentation and technical scrutiny from the outset.
8. Practical FAQs On Deepfakes, Section 65B And IT Rules 2026
Q. Are deepfake videos admissible as evidence in Indian courts?
Deepfake videos are not automatically inadmissible. They are treated as electronic records and must satisfy the same statutory requirements under Sections 65A and 65B, including the prescribed conditions and certification. Once admitted, the real dispute usually centres on the weight the court assigns to evidence whose reliability is contested.
Q. Do the IT Rules 2026 change the need for a Section 65B certificate?
No. The IT Rules 2026 regulate intermediary obligations and SGI management but leave the law of evidence untouched. Parties must still comply with Section 65B for electronic evidence; SGI-related logs and metadata can, however, help prove or contest authenticity.
Q. How can lawyers prove that a recording is not a deepfake?
Lawyers can combine several methods: securing the original device, obtaining Section 65B certification, commissioning forensic reports, corroborating the recording against independent sources, and relying on platform-side provenance and SGI labelling data where available. Courts are likely to weigh multiple technical and contextual indicators together, as no single method is conclusive.
Q. What role do platforms play in evidentiary disputes?
The IT Rules 2026 require intermediaries to label SGI, embed metadata and assist law enforcement with requested information, making them key custodians of provenance data. Their records can help establish when content first appeared online, where it originated, how it was handled, and whether it bears markers of synthetic generation.
The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.