16 March 2026

India's Digital Personal Data Protection Act And The DPDP Rules, 2025: Phased Commencement, Core Obligations And A Board-Ready Compliance Strategy

Sumit Kochar, Dolce Vita Advisors

Introduction

India's digital privacy framework has now moved from legislative intent to staged implementation. The Digital Personal Data Protection Act, 2023 (DPDP Act) created the statutory architecture, but the compliance debate changed materially in November 2025 when the Central Government notified phased commencement, established the Data Protection Board of India, fixed its strength, and notified the Digital Personal Data Protection Rules, 2025 (DPDP Rules). For boards, founders, platform operators, regulated businesses and global groups serving the Indian market, the right question is no longer whether India has a general data protection law. The right question is what is already live, what comes into force later, and which controls should be sequenced now rather than deferred.

That distinction matters because the Indian regime is not commencing in one move. The institutional provisions are already in force. The Board exists. The rulebook exists. Some operational provisions, such as board functioning and the appointment framework, are already active, and the Consent Manager registration layer follows on a separate date before the balance of substantive obligations. The bulk of the core processing obligations, however, including notice, consent, legitimate uses, children's data, Significant Data Fiduciary obligations, rights handling, appeals, penalties and the central blocking power, are scheduled to come into force eighteen months from publication of the commencement notification.

This article analyses the DPDP Act and the final DPDP Rules as they stand today, with a focus on the legal structure of the statute, the staged commencement model, and the practical implications for Indian and foreign businesses offering goods or services into India. It also identifies the issues that merit immediate board attention: lawful-basis mapping, processor contracting, children's data flows, breach readiness, Significant Data Fiduciary preparedness, and the interaction between the DPDP framework and sectoral overlays under Indian law.

The phased commencement architecture is itself a compliance issue

The most important operational point is chronological. By notification dated 13 November 2025, the Central Government brought into force, with immediate effect, section 1(2), section 2, sections 18 to 26, sections 35, 38, 39, 40, 41, 42 and 43, together with sections 44(1) and 44(3). One year from publication, section 6(9), dealing with registration of Consent Managers, and section 27(1)(d) come into force. Eighteen months from publication, the remainder of the core framework comes into effect, including sections 3 to 17, section 27 except clause (d), sections 28 to 34, section 36, section 37 and section 44(2).

The Rules follow the same staggered pattern. Rules 1, 2 and 17 to 21 came into force on publication. Rule 4 comes into force one year from publication. Rules 3, 5 to 16, 22 and 23 come into force eighteen months from publication. This means that organisations should not treat the Rules as a single monolith. Some provisions are immediately relevant for institutional setup, governance and market architecture even where the main conduct rules become operational later.

On the current timeline, this produces three dates of practical importance: 13 November 2025 for institutional commencement; 13 November 2026 for Consent Manager registration and the corresponding Board function; and 13 May 2027 for the main processing, rights, enforcement and penalty architecture. From a governance perspective, this sequencing has two consequences. One, organisations cannot credibly argue that the law is not yet real. Two, the implementation window should be used to design systems that can survive later scrutiny by the Board, rather than to wait for the last quarter before the main obligations commence.

Date | What commences | Practical effect
13 Nov 2025 | Selected Act provisions and Rules 1, 2 and 17 to 21 | Board established, rule-making live, digital-office and appointment architecture operational
13 Nov 2026 | Section 6(9), section 27(1)(d), and Rule 4 | Consent Manager registration layer becomes operational
13 May 2027 | Core conduct and enforcement provisions | Notice, consent, legitimate uses, children's data, SDF obligations, rights, appeals, penalties and blocking power become active
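The staggered dates follow mechanically from the publication date of the commencement notification. As a quick sanity check, the one-year and eighteen-month offsets can be reproduced in a few lines of Python (an illustrative sketch, not part of the notification itself):

```python
from datetime import date

def add_months(d: date, months: int) -> date:
    # Shift a date by whole calendar months; day-of-month is preserved,
    # which is safe here because the 13th exists in every month.
    total = d.month - 1 + months
    return d.replace(year=d.year + total // 12, month=total % 12 + 1)

publication = date(2025, 11, 13)                     # commencement notification
consent_manager_start = add_months(publication, 12)  # section 6(9), Rule 4
core_obligations_start = add_months(publication, 18) # main conduct provisions

print(consent_manager_start)   # 2026-11-13
print(core_obligations_start)  # 2027-05-13
```

The output matches the three dates of practical importance identified above: 13 November 2026 for the Consent Manager layer and 13 May 2027 for the core processing, rights and enforcement architecture.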

The statute is narrower than a GDPR clone, but broader than a minimalist privacy law

The DPDP Act applies to digital personal data processed within India where such data is collected in digital form, or collected in non-digital form and digitised subsequently. It also applies extraterritorially where personal data is processed outside India in connection with any activity related to offering goods or services to Data Principals within India. That formulation is commercially significant. It clearly captures offshore platforms, apps, software businesses and service providers with Indian users or customers even if the processing stack is located elsewhere.

At the same time, the Act is not an omnibus privacy statute in the European mould. It does not regulate anonymised data. It is centred on digital personal data, lawful purpose, consent, specified legitimate uses, fiduciary accountability, board-led adjudication and a penalty schedule. The Act also carves out certain publicly available personal data from its scope where the data is made publicly available by the Data Principal themselves or by another person under a legal obligation to make it public.

The architecture is therefore targeted rather than conceptually maximalist. But businesses should not underestimate it. The extraterritorial hook, the penalties, the Board's powers, and the Central Government's ability to direct blocking of repeat offenders together make the framework commercially serious.

Illustrative example: foreign SaaS provider
A Singapore-based SaaS provider offering workflow software to Indian enterprise customers and processing employee or end-user information in the course of that service must analyse the DPDP Act even if its servers are outside India. The reason is not data localisation. The reason is the statute's express application to processing outside India where the processing is connected to offering goods or services to Data Principals within India.

Lawful grounds for processing: consent remains central, but section 7 is strategically important

a. Notice and consent

Section 4 states the core rule: personal data may be processed only for a lawful purpose, either on the basis of consent or for certain legitimate uses under section 7. That immediately creates two workstreams for any serious compliance programme. The first is consent design. The second is lawful-basis mapping for uses that are better analysed under section 7.

Section 5 requires every request for consent to be accompanied or preceded by notice. Section 6 then requires consent to be free, specific, informed, unconditional and unambiguous, given by clear affirmative action, and limited to personal data necessary for the specified purpose. The statute therefore deliberately prohibits the bundling of consents for multiple purposes into a single undifferentiated request. Consent cannot be presumed from silence. It must relate to a specified purpose. Withdrawal must be as easy as giving consent, and once consent is withdrawn the fiduciary must cease processing within a reasonable time, subject to consequences borne by the Data Principal and subject to processing already validly undertaken.

The final Rules materially operationalise this. Rule 3 requires the notice to be presented independently from other information, in clear and plain language, and to contain at least an itemised description of the personal data, the specific purposes of processing, and the communication link or other means by which the Data Principal can withdraw consent, exercise rights and make a complaint to the Board. In practice, this pushes Indian businesses away from dense omnibus privacy notices and toward layered or purpose-linked consent architecture.

The Act also contains an important evidentiary point. Where consent is the basis of processing and a dispute arises, the Data Fiduciary must prove that notice was given and consent was obtained in accordance with the Act and Rules. That makes audit trails and consent records a core legal risk issue, not merely a product-design issue.
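Because the evidentiary burden sits with the fiduciary, consent records need to be structured, queryable artefacts rather than free-text log lines. A minimal sketch of such a record follows; the field names and shape are illustrative assumptions, not anything prescribed by the Act or the Rules:

```python
from dataclasses import dataclass, replace
from datetime import datetime, timezone
from typing import Optional

@dataclass(frozen=True)
class ConsentRecord:
    # Illustrative schema only; the DPDP framework does not prescribe one.
    data_principal_id: str
    purpose: str                      # the specified purpose consented to
    notice_version: str               # which Rule 3-style notice accompanied the request
    data_items: tuple                 # itemised personal data covered by the consent
    given_at: datetime
    withdrawn_at: Optional[datetime] = None

    @property
    def active(self) -> bool:
        return self.withdrawn_at is None

# Recording consent, then withdrawal, without mutating the original evidence
given = ConsentRecord(
    data_principal_id="dp-1001",
    purpose="order fulfilment",
    notice_version="v3-2025-11",
    data_items=("name", "shipping address", "phone"),
    given_at=datetime(2026, 1, 5, tzinfo=timezone.utc),
)
withdrawn = replace(given, withdrawn_at=datetime(2026, 6, 1, tzinfo=timezone.utc))

print(given.active, withdrawn.active)  # True False
```

The design point is that withdrawal creates a new state rather than overwriting history: in a dispute, the fiduciary must be able to show both that consent was validly obtained against a specific notice and when it ceased to operate.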

b. Consent Managers

The Act recognises the Consent Manager as a registered, Board-supervised intermediary through whom a Data Principal may give, manage, review or withdraw consent. The final Rules set the first serious architecture for this market. Rule 4 and the First Schedule prescribe registration conditions and operational obligations. Among other things, the Consent Manager must be interoperable, must enable the Data Principal to give, deny or withdraw consent, must maintain records of consents and notices, and must ensure that shared personal data is not readable by it.

This is one of the most structurally interesting parts of the Indian regime. Consent management is not left as an informal product feature. It is turned into a regulated layer. Businesses that currently rely on fragmented consent states across apps, affiliates and third-party partners should assume that consent provenance and withdrawal synchronisation will become a point of regulatory focus.

c. Certain legitimate uses under section 7

Section 7 is not an afterthought. It is one of the most commercially important provisions in the Act. It allows processing without consent for defined uses, including where the Data Principal has voluntarily provided her data for the specified purpose and has not indicated refusal; where the State processes data for subsidy, benefit, service, certificate, licence or permit; for compliance with judgments or legal obligations; for responding to medical emergencies or public health threats; for disaster management; and for employment purposes or safeguarding the employer from loss or liability.

The provision should not be read as a general reasonable-purposes clause. It is a closed list. But within that list, the employment limb is especially consequential for Indian businesses. Section 7(i) expressly covers processing for employment purposes and to safeguard the employer from loss or liability, including prevention of corporate espionage, maintenance of confidentiality of trade secrets, intellectual property or classified information, and provision of employee benefits. That language is not limitless, but it is broad enough to support many internal HR, security, access-control and investigation workflows without awkward over-reliance on consent.

Illustrative example: employee investigation
Assume a company suspects exfiltration of confidential source code by a departing employee and reviews access logs, device records and message metadata generated within corporate systems. Framing this solely as consent-based processing would be weak because employee consent is often contested in practice. Section 7(i) provides a materially stronger legal anchor, provided the processing is genuinely tied to employment purposes or protection against loss or liability and remains proportionate.

General obligations of Data Fiduciaries: the operational core

Section 8 contains the operational backbone of the statute. Accountability cannot be outsourced. A Data Fiduciary remains responsible for compliance even where processing is carried out on its behalf by a Data Processor. Processors may be engaged only under a valid contract. Where personal data is likely to be used for decisions affecting the Data Principal or disclosed to another Data Fiduciary, the data must be complete, accurate and consistent. The fiduciary must implement appropriate technical and organisational measures to ensure observance of the Act and Rules, and it must protect personal data in its possession or control by taking reasonable security safeguards to prevent personal data breach.

The Rules turn these principles into more granular duties. Rule 6 requires reasonable security safeguards such as encryption, obfuscation or masking; access control and least-privilege measures; monitoring for unauthorised access; logs and data retention for one year to enable detection, investigation and remediation of unauthorised access; appropriate contractual provisions with processors; and technical and organisational measures to ensure effective observance of the safeguards. This is one of the clearest indicators that a merely paper-based privacy programme will not suffice.

Rule 7 requires breach intimation to affected Data Principals in a concise, clear and plain manner, with details such as the nature and extent of the breach, likely consequences, mitigation measures and recommended safety steps. A separate notice must also be given to the Board. Indian businesses should therefore build for dual-track breach communications: regulator-facing and individual-facing.

Section 8(9) and Rule 9 further require publication of the business contact information of the Data Protection Officer, if applicable, or another person who is able to answer questions about processing. The contact details must appear prominently on the website or app and must also be mentioned in responses to rights-related communications. In practice, this forces organisations to designate an accountable human interface for privacy compliance even where they are not yet Significant Data Fiduciaries.

Rights and duties of Data Principals: not an ornamental chapter

The rights chapter is shorter than in some foreign laws, but it is operationally meaningful. Section 11 gives the Data Principal the right to access information about the personal data being processed. Section 12 gives the right to correction, completion, updating and erasure, subject to retention being necessary for the specified purpose or for compliance with law. Section 13 requires readily available means of grievance redressal and requires the Data Principal to first exhaust the fiduciary or Consent Manager grievance mechanism before approaching the Board. Section 14 adds a right to nominate another individual who may exercise the Data Principal's rights upon death or incapacity.

Importantly, the Act also imposes duties on Data Principals under section 15. These include complying with applicable laws while exercising rights, not impersonating another person, not suppressing material information while providing personal data, not registering false or frivolous grievances or complaints, and furnishing only authentic information while exercising correction or erasure rights. This is unusual in comparative terms and will likely be invoked by businesses in cases involving abusive complaints or identity misuse. It should not be overused as a shield against legitimate rights requests, but it is a real statutory feature.

From a product and records perspective, the rights chapter means that businesses should prepare not only privacy notices but also a rights-operating model: intake channels, identity verification, internal routing, response templates, correction workflows, erasure holds and nomination handling.

Erasure, retention and the specified-purpose-no-longer-being-served problem

One of the more operationally demanding parts of the regime is the interaction between section 8(7), section 8(8) and rule-based retention periods. The Act requires erasure once the Data Principal withdraws consent or once the specified purpose is no longer served and retention is not necessary for legal compliance. The Rules then deem the specified purpose to be no longer served for certain classes of large consumer-facing fiduciaries after defined periods of inactivity.

The Third Schedule is particularly significant for large digital businesses. It sets inactivity-based timelines for certain e-commerce entities, online gaming intermediaries and large social media intermediaries. For specified classes of entities, three years from the date on which the Data Principal last approached the Data Fiduciary for performance of the specified purpose or exercise of rights, or the commencement of the Rules, whichever is later, becomes the relevant period. At least forty-eight hours before this erasure trigger, the Data Principal must be informed that the data will be erased unless she logs into her user account, initiates contact, or exercises rights.

This is not just a retention issue. It is a product issue, a records issue and a defensibility issue. Businesses will need to define the specified purpose with care, map inactivity logic across systems, and reconcile erasure with legal hold, tax, fraud-prevention, sectoral retention and cybersecurity log-retention obligations.

At the same time, the Rules include a separate one-year minimum retention of personal data, associated traffic data and other processing logs for specified purposes under the Seventh Schedule. This means that immediate erasure is not always the legally correct answer. The real compliance discipline lies in reconciling erasure triggers with mandatory short-term retention for security, accountability and state information requests.

Illustrative example: marketplace inactivity
A large e-commerce platform cannot simply retain dormant user data indefinitely because the account may become useful later for analytics or reactivation campaigns. Once the rule-based period is reached and the user has not interacted for the specified purpose or rights exercise, the platform must be prepared to erase, subject to legal retention requirements and the mandatory notice before erasure.
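The inactivity logic described above can be expressed as a small scheduling function. This is a simplified sketch: the three-year period is approximated as 3 × 365 days, and a production system would additionally have to honour legal holds, sectoral retention mandates and the Seventh Schedule's separate one-year log-retention floor:

```python
from datetime import datetime, timedelta, timezone

INACTIVITY_PERIOD = timedelta(days=3 * 365)  # Third Schedule period, approximated
NOTICE_LEAD_TIME = timedelta(hours=48)       # minimum pre-erasure notice window

def erasure_schedule(last_interaction: datetime, rules_commencement: datetime):
    """Return (notify_by, erase_on) for a dormant account.

    The clock starts from the later of the user's last interaction for the
    specified purpose (or rights exercise) and the commencement of the Rules.
    """
    clock_start = max(last_interaction, rules_commencement)
    erase_on = clock_start + INACTIVITY_PERIOD
    notify_by = erase_on - NOTICE_LEAD_TIME  # 48-hour warning must issue by here
    return notify_by, erase_on

# Example: the user was last active before the Rules commence,
# so the commencement date governs the start of the inactivity clock.
notify_by, erase_on = erasure_schedule(
    last_interaction=datetime(2026, 2, 1, tzinfo=timezone.utc),
    rules_commencement=datetime(2027, 5, 13, tzinfo=timezone.utc),
)
```

Even in this toy form, the function surfaces the two operational duties the Rules create: computing the correct trigger date across systems, and guaranteeing that the pre-erasure notice actually goes out before the trigger fires.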

Children's data: a strong default rule with carefully designed carve-outs

Section 9 creates a strict baseline for children's data. A Data Fiduciary must, before processing the personal data of a child, obtain verifiable consent from the parent or lawful guardian. It must also refrain from processing likely to cause any detrimental effect on the well-being of a child and from tracking, behavioural monitoring or targeted advertising directed at children.

The final Rules are especially important here. Rule 10 requires technical and organisational measures to ensure that verifiable parental consent is obtained, and due diligence to verify that the person claiming to be a parent is an identifiable adult. Rule 11 performs a similar function for persons with disability who have lawful guardians.

However, the Rules do not stop at a blanket prohibition. Rule 12 and the Fourth Schedule create targeted exemptions from section 9(1) and 9(3) for specified classes of Data Fiduciaries and specified purposes, subject to conditions. Part A includes clinical establishments, mental health establishments, healthcare professionals and allied healthcare professionals to the extent necessary for protecting the child's health. It also includes educational institutions for tracking and behavioural monitoring for educational activities or child safety; crèches and day care settings for safety-related monitoring; and transport providers engaged by schools or child care centres for location tracking in the interests of children's safety during travel.

Part B contains purpose-based exemptions, such as exercising powers or functions in the interests of the child under law, providing subsidy or benefits in the interests of the child, creating an email-only user account, determining real-time location in the interests of safety and protection, and preventing access by children to information, services or advertisements likely to have a detrimental effect on their well-being.

These carve-outs are carefully drafted and should not be over-read. They do not create a general exemption for all edtech, all child-facing products or all age-verification tools. The processing must stay within the stated purpose and conditions.

Illustrative example: school bus tracking
If a school engages a transport provider to track the real-time location of enrolled children during school travel for safety purposes, the Rules specifically contemplate that use in the Fourth Schedule. That is materially different from an entertainment app behaviourally profiling children for engagement optimisation, which would face a far more difficult justification.

Significant Data Fiduciaries: where governance moves from privacy policy to board architecture

The Act authorises the Central Government to notify any Data Fiduciary or class of Data Fiduciaries as a Significant Data Fiduciary on factors such as volume and sensitivity of data processed, risk to rights, impact on sovereignty and integrity, risk to electoral democracy, security of the State and public order. That criterion design shows that SDF designation is not just about size. It is about systemic consequence.

Once notified, the obligations become more demanding. Under section 10, an SDF must appoint a Data Protection Officer who is based in India, is responsible to the Board of Directors or similar governing body, and acts as the key point of contact for grievances. It must appoint an independent Data Auditor. It must undertake periodic Data Protection Impact Assessments and such other measures as may be prescribed.

The Rules then sharpen the obligation. Rule 13 requires an SDF, once in every twelve months from notification, to undertake a Data Protection Impact Assessment and an audit to ensure effective observance of the Act and Rules. The person carrying out the assessment and audit must furnish to the Board a report containing significant observations. The Rule also requires due diligence to verify that technical measures, including algorithmic software used for hosting, display, uploading, modification, publishing, transmission, storage, updating or sharing of personal data, are not likely to pose a risk to the rights of Data Principals.

This is one of the clearest statutory hooks for governance of automated systems within India's privacy framework. The obligation is not framed as a standalone AI law. But it is clearly designed to capture risks arising from algorithmic systems used in large-scale digital environments.

Practical implication
Businesses that may plausibly fall into the SDF category should not wait for formal designation to begin DPIA-style discipline. They should identify high-risk processing, prepare governance reporting lines, define DPO reporting structures and create a methodology for algorithmic risk review. When designation comes, the lagging organisation will not merely be late; it will be structurally unprepared.

AI, Automated Decision-Making and the DPDP Act: Why Data Protection Will Be India's First-Line AI Governance Tool

India's emerging AI governance model suggests that, at least in the near term, data protection law will function as the principal horizontal compliance layer for AI systems that process personal data. This is consistent with the Government's recently issued India AI Governance Guidelines, which favour a balanced and innovation-oriented approach, emphasise regulation of AI applications rather than the underlying technology itself, and indicate that many AI-related harms can be addressed through existing laws rather than a standalone omnibus AI statute at this stage. In that framework, the DPDP Act assumes particular significance because it already applies across sectors and technology stacks wherever "digital personal data" is processed for a lawful purpose.

For AI developers, deployers, fine-tuners and enterprise adopters, the practical implication is straightforward: if an AI workflow involves personal data of individuals in India, the question is no longer whether the system is "AI regulated" in the abstract, but whether the relevant processing activity can be lawfully anchored within the DPDP framework. That inquiry runs through the full lifecycle of the model or application, including ingestion of training and fine-tuning data, customer support interactions, employee-monitoring tools, recommendation engines, fraud-detection systems, biometric or behavioural analytics, and generative AI tools that retain prompts or outputs containing personal data. The Act's core architecture (notice, consent or another statutory ground, purpose limitation, security safeguards, contractual control over processors, special protections for children, and accountability of the Data Fiduciary) applies in a technology-neutral manner and therefore maps directly onto AI deployments even though the statute does not use AI-specific terminology.

This is precisely where the AI Governance Guidelines become useful as an interpretive and strategic signal. The Guidelines identify several AI-specific issues that sit at the intersection of data protection and AI governance: the use of publicly available personal data for model training, the compatibility of collection and purpose limitation principles with modern AI architectures, the role of consent managers in AI workflows, the need for contextual and dynamic notices in multimodal or ambient-computing environments, and the allocation of liability across the AI value chain. In other words, the policy conversation in India is not whether AI should sit outside data protection law; it is whether the existing data-protection framework requires calibrated refinements to better address AI-specific realities. Until those refinements arrive, organisations should assume that the DPDP Act is the operative baseline.

That baseline has concrete compliance consequences. A company deploying an AI-enabled hiring tool, for instance, cannot treat candidate data merely as "input" into a software workflow. If resumes, psychometric data, interview transcripts or behavioural indicators are digitised and processed, the enterprise must identify the lawful basis for such processing, ensure that notices are properly framed, assess whether the processing remains within the disclosed purpose, and impose processor controls if the AI vendor is acting on its behalf. The same logic applies to a hospital deploying AI-assisted diagnostics, a lender using machine learning for credit assessment, or a retailer using personalised recommendation engines. In each case, the DPDP analysis must sit alongside sector-specific law, contractual architecture and internal governance. AI does not displace data protection compliance; it intensifies it.

The more difficult cases arise where AI systems are trained or improved using large, mixed and continuously refreshed datasets. Here, the compliance challenge is not just collection, but repurposing. A dataset obtained for one customer-facing purpose may later be used for model optimisation, safety tuning, analytics, or internal product improvement. That creates immediate questions under the DPDP architecture around purpose specificity, adequacy of notice, validity of consent design, and whether the processing can properly fit within one of the Act's recognised statutory pathways. The AI Governance Guidelines acknowledge these tensions and recommend that India continue with a principle-based and targeted approach rather than prematurely imposing a rigid AI licensing regime. For businesses, however, that policy flexibility should not be mistaken for legal permissiveness. If personal data is involved, repurposing risk is real, and compliance should be documented contemporaneously rather than reconstructed after deployment.

A second major theme is accountability across the AI value chain. The Guidelines repeatedly emphasise the need for transparency about the roles of developers, deployers and users, and call for clearer classification and liability rules in India's broader digital-law framework. That recommendation is especially important in the DPDP context because AI supply chains are often fragmented: one entity collects the data, another provides the foundational model, a third fine-tunes it, and a fourth embeds it into an enterprise product. Yet under the DPDP Act, the Data Fiduciary remains the central accountability node vis-à-vis the Data Principal, even where a Data Processor is engaged. As a result, enterprises cannot contract out of responsibility merely by relying on an external AI vendor. Vendor due diligence, technical validation, contractual allocation of roles, audit rights, security testing and incident management become essential parts of AI compliance design.

The Guidelines also add an important normative layer that is not expressed in the DPDP Act in AI-specific language but is likely to influence future regulatory expectations: people first, fairness and equity, accountability, understandable by design, and safety, resilience and sustainability. These principles do not yet create a standalone statutory AI code, but they provide a credible compliance blueprint for high-impact AI systems. In practice, that means organisations using AI with personal data should move beyond narrow checkbox privacy compliance and adopt defensible governance controls such as dataset lineage reviews, bias and discrimination testing, human escalation protocols, model-change logs, output validation for material decisions, explainability standards for user-facing systems, and documented thresholds for when automated outputs must be reviewed by a human decision-maker. In sectors such as employment, healthcare, finance, education and public services, those controls are increasingly part of prudent legal risk management, even where not yet mandated through a separate AI law.

A short illustration makes the point. Consider a consumer-facing generative AI assistant used by an e-commerce platform. If the tool retains chat prompts, profiles purchasing patterns, and personalises outputs using prior customer interactions, the system raises ordinary DPDP questions around lawful basis, notice, purpose limitation, security safeguards and data-retention controls. But it also raises AI-governance questions around transparency of automated assistance, accuracy of outputs, discriminatory or manipulative profiling, and content provenance where synthetic content is generated. The compliance answer cannot come from privacy, technology, marketing and litigation teams working in silos. It requires an integrated governance model in which DPDP compliance is the legal floor, and AI-specific trust-and-safety controls are the operational overlay.

The strategic conclusion is therefore clear. In India, AI governance is likely to evolve through a techno-legal layering exercise rather than through a single transformative statute. For now, the DPDP Act is the most immediate and operationally relevant part of that stack wherever AI systems process personal data. Organisations that treat the DPDP Act as merely a privacy-policy exercise will fall short of the required compliance standard. Organisations that treat it as the foundation of broader AI governance, and build role clarity, dataset discipline, human oversight, bias controls, provenance safeguards and incident response around it, will be better positioned for both present compliance and future regulatory change.

Why the DPDP Act Will Help Some Digital Businesses, but Will Not Fully Resolve the VPN Problem

The Digital Personal Data Protection Act, 2023 and the Digital Personal Data Protection Rules, 2025 will not affect all digital sectors in the same way. Businesses whose operating model depends on persistent identity data, telemetry, account-level traceability, cross-border routing or long-tail retention will feel the regime more sharply than businesses that process limited transactional data. VPN providers are the clearest example because they sit at the intersection of two competing regulatory narratives in India: privacy-enhancing technology on the one hand, and cyber-security traceability and investigatory access on the other. The DPDP framework unquestionably improves legal structure around notice, lawful basis, security and user rights. But it does not erase older cyber-security obligations that pull in the opposite direction.

That tension predates the DPDP Act. In April 2022, CERT-In issued directions under section 70B of the Information Technology Act, 2000 requiring data centres, VPS providers, cloud service providers and VPN service providers to collect and retain specified subscriber information, including validated customer names, period of hire, allotted IP addresses, registration-time email and IP details, purpose of hiring the service, validated address and contact details, and ownership pattern, for five years after cancellation or withdrawal of registration. The same directions also require logs of ICT systems to be enabled and maintained securely for 180 days, with the logs to be maintained within Indian jurisdiction. CERT-In later clarified in its FAQs that these directions do not apply to enterprise or corporate VPNs used internally, but do apply to consumer-facing VPN services that provide internet-proxy-like functionality to the general public. Reuters also reported in 2022 that major global VPN providers responded by removing physical servers from India rather than comply with the retention model, illustrating that the legal issue was not merely theoretical.

Against that backdrop, the DPDP Act does solve part of the compliance puzzle for VPNs and similarly situated service providers. First, it gives a clear statutory architecture for processing personal data: processing must be for a lawful purpose, either on consent or for a recognised "certain legitimate use." Secondly, section 7(d) expressly permits processing for fulfilling an obligation under Indian law to disclose information to the State or its instrumentalities, and section 6(6) permits continued processing after withdrawal of consent where processing is required or authorised by the Act, the Rules or any other law in force. Thirdly, section 8(7) separately preserves retention where it is necessary for compliance with another law. For businesses that were previously navigating cyber-security directions, contracts, sector-specific rules and privacy expectations without a single data-protection statute, that is a real improvement: the legal basis for mandated retention and disclosure is now more legible, and the provider's obligations in relation to notices, rights handling, processor contracts, security safeguards and breach response are now much clearer.

Even so, the Act does not "solve" the VPN problem in the business-model sense. A privacy-focused VPN typically markets itself on data minimisation, low traceability and no-log or near-no-log positioning. The CERT-In directions, by contrast, require validated identity information and traceability-related records for years after service termination. The DPDP Act does not repeal those directions, nor does it create a supremacy clause that wipes away sectoral logging or disclosure mandates. On the contrary, the Act is drafted to coexist with such mandates by allowing processing and retention where another law requires it. Legally, that means a VPN provider can structure a compliance justification. Commercially, however, the harder question remains: can a provider still present itself to users as privacy-maximising if Indian law requires customer-identifying data and service-use metadata to be retained? For many consumer VPN providers, that tension remains unresolved, which is why the Act should be seen as a compliance framework, not as a policy reset in favour of strict anonymity.

The same point extends beyond VPNs to cloud, VPS and infrastructure providers. The good news for those sectors is that the DPDP Act does not adopt a blanket data-localisation model. Section 16 instead empowers the Central Government to restrict transfers to notified jurisdictions, which is materially more workable than an across-the-board localisation rule. That architecture should make it easier for cloud, SaaS and multinational service providers to continue using regional or global infrastructure stacks, subject to any future country restrictions and sector-specific requirements. But the uncertainty is not gone. The 2025 Rules allow the Central Government, in the case of Significant Data Fiduciaries, to require that specified personal data and even traffic data relating to its flow not be transferred outside India. Read together with the continuing CERT-In regime, this means that India has not moved to a pure "free flow of data" model. It has instead adopted a layered system in which general transfer flexibility coexists with targeted restrictions, incident-reporting duties, contract requirements and retention mandates.

The DPDP regime will also alter the compliance posture of consumer internet businesses that are not in the VPN market at all. Large e-commerce entities, online gaming intermediaries and social media intermediaries are expressly addressed in the Third Schedule to the 2025 Rules. For such entities above the specified user thresholds, personal data generally must be erased if the user has not approached the platform for the relevant purpose or exercised rights for a period of three years, subject to carve-outs such as maintaining access to the user account or virtual tokens. That is a major shift for businesses built on indefinite account dormancy, long-tail analytics and passive retention. For those sectors, the Act and Rules solve an old ambiguity by finally setting a statutory baseline for notice, consent withdrawal, security safeguards and inactive-account retention. But they also create operational burdens: platforms will need re-engineered retention clocks, deletion workflows, user-warning mechanisms and product-level logic to distinguish between "inactive" data that must be erased and data that must be preserved for live accounts, fraud controls, token access or legal compliance.
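The inactive-account erasure logic described above can be expressed as a simple decision function. This is a minimal sketch only: the three-year window and the carve-outs follow the summary of the Third Schedule given here, but the field names, the helper function and the flat 1,095-day approximation of three years are illustrative assumptions, not statutory terms or a reference implementation.

```python
from datetime import datetime, timedelta

# Illustrative inactivity window based on the Third Schedule summary above;
# the exact period and carve-outs should be confirmed against the 2025 Rules.
INACTIVITY_WINDOW = timedelta(days=3 * 365)

def must_erase(last_user_activity: datetime,
               last_rights_exercise: datetime,
               has_active_account_access: bool,
               holds_virtual_tokens: bool,
               under_legal_hold: bool,
               now: datetime) -> bool:
    """Return True if personal data falls due for erasure under the
    inactive-account rule as summarised in this article. All parameters
    are hypothetical model fields, not statutory language."""
    # Carve-outs: live account access, virtual tokens, or a preservation
    # duty under another law keep the data outside the erasure obligation.
    if has_active_account_access or holds_virtual_tokens or under_legal_hold:
        return False
    # The clock runs from the later of the user's last approach to the
    # platform for the relevant purpose or the last exercise of rights.
    last_touch = max(last_user_activity, last_rights_exercise)
    return (now - last_touch) > INACTIVITY_WINDOW
```

In practice this check would sit inside a scheduled deletion workflow, with a user-warning step before erasure and an audit log of each decision; the point of the sketch is that "inactive" is a computed state that product engineering must model explicitly, not a label that can be applied manually.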

A practical example shows how these layers now interact. Take a consumer VPN provider offering services to users in India. Because it offers services to individuals in India, the DPDP Act can apply even if parts of the processing occur outside India. If the provider collects names, emails, billing details, IP-related metadata or device-linked support records, it must now map those flows against the Act's lawful-basis architecture, provide compliant notices, implement the Rule 6 security safeguards, and be prepared to notify both affected users and the Board in the event of a personal data breach. But it must also continue to assess its exposure under the CERT-In directions, which independently require retention of specified subscriber information and logs. In short, the DPDP Act makes the privacy side of the compliance programme more coherent; it does not remove the surveillance-traceability side of the compliance burden.

The larger lesson is that the DPDP Act is best understood as a baseline data-governance law, not a one-statute solution to every sectoral contradiction. It will help industries by standardising lawful-basis analysis, processor accountability, breach response, user-rights handling and erasure logic. But the sectors facing the hardest issues today, especially VPNs and certain infrastructure providers, are not struggling because India lacked a privacy statute alone. They are struggling because privacy, cyber-security, traceability and investigatory access are being regulated through overlapping instruments that do not always pull in the same direction. Unless those overlaps are further harmonised through guidance, future rulemaking or sector-specific clarifications, the DPDP Act will improve compliance design without fully removing the underlying policy tension.

Cross-border transfers, exemptions and statutory spillover

Section 16 adopts a notification-based restriction model. The Central Government may, by notification, restrict transfer of personal data by a Data Fiduciary for processing to any country or territory outside India as may be notified. The section also preserves the operation of other Indian laws that impose a higher degree of protection or restriction in relation to certain data or classes of fiduciaries.

The important negative implication is that the statute does not create a general localisation mandate or an exhaustive adequacy-list architecture. But that should not be mistaken for unrestricted permissibility. Section 16 allows targeted future restrictions, and sectoral or contract-specific rules may still impose tighter controls. For multinational groups, this means transfer analysis in India should be reframed. The starting point is not simply whether export is prohibited. The starting point is what data is being processed, which Indian-facing activity triggers the Act, what other sectoral laws apply, and whether any section 16 notification or contract-specific control alters the position.

The DPDP Act also contains a series of exemptions in section 17, including for enforcing legal rights or claims, judicial and quasi-judicial functions, prevention or investigation of offences, offshore processing of non-Indian Data Principals under contracts with foreign persons, mergers and amalgamations approved by authorities, and certain classes of fiduciaries or processing by government notification. Rule 16 separately exempts processing for research, archiving or statistical purposes where it is carried on in accordance with the standards in the Second Schedule.

Section 44 deserves mention as well because it creates statutory spillover beyond the privacy chapter. It omits section 43A of the Information Technology Act, 2000, amends section 81 of that Act to recognise the DPDP Act, and substitutes clause (j) of section 8(1) of the Right to Information Act, 2005 with the phrase information which relates to personal information. These linked amendments are part of the reason the DPDP Act should be read as a system statute rather than a standalone compliance silo.

The Data Protection Board of India is now a real institution, not a future placeholder

By notification dated 13 November 2025, the Central Government established the Data Protection Board of India with effect from publication. The Board's head office is in the National Capital Region. A separate notification fixed the Board's strength at four members. That institutional commencement is highly relevant because it signals that the enforcement framework is being operationalised before the bulk of substantive obligations come into force.

The Board's powers under section 27 are broad. They include taking cognisance of personal data breach intimations, ordering urgent remedial or mitigation measures, inquiring into breaches on complaints or government references, inquiring into breaches by Consent Managers, and performing other functions assigned under the Act. Section 28 requires the Board to function as an independent body and, as far as practicable, as a digital office. The Act also makes clear that the Board is guided by the principles of natural justice rather than being bound by the Code of Civil Procedure.

The Rules already prescribe key elements of board functioning, remuneration, meeting procedure, digital operations and appeals. Rule 20 provides that the Board shall function as a digital office and may adopt techno-legal measures to conduct proceedings without requiring physical presence. The Rules further contemplate that inquiries should ordinarily be completed within six months, extendable by recorded reasons for further periods not exceeding three months at a time.

Appeals to the Appellate Tribunal are to be filed digitally, and the appeal fee is aligned to the Telecom Regulatory Authority of India Act regime, payable digitally through UPI or another authorised system. Under section 29 of the Act, appeals are to be filed within sixty days, with a power to condone delay for sufficient cause, and the Tribunal is expected to endeavour to dispose of appeals within six months.

Penalties are headline-grabbing, but repeated non-compliance creates a more strategic risk

The Schedule to the Act sets out penalties that are commercially meaningful. Breach of the obligation to take reasonable security safeguards may attract a penalty of up to INR 250 crore. Breach of the obligation to notify the Board or affected Data Principals of a personal data breach may attract up to INR 200 crore. Breach of children's data obligations may attract up to INR 200 crore. Breach of SDF obligations may attract up to INR 150 crore. Residual breaches may attract up to INR 50 crore.

The Act also permits mediation in appropriate cases and voluntary undertakings. Section 32 allows the Board to accept a voluntary undertaking at any stage of the proceedings, including commitments to take or refrain from specified action, and acceptance ordinarily bars further proceedings on the contents of that undertaking. But breach of the undertaking reopens risk, and the penalty can run up to the amount applicable to the underlying breach.

The more strategic risk lies in section 37. Where the Board has imposed monetary penalty on a Data Fiduciary in two or more instances, and advises blocking in the interests of the general public, the Central Government may direct an intermediary or agency to block public access to information enabling that fiduciary to offer goods or services to Data Principals in India. For consumer technology businesses, repeated privacy non-compliance can therefore become an access-to-market issue, not merely a fine issue.

Illustrative breach category | Maximum penalty
Failure to take reasonable security safeguards to prevent personal data breach | INR 250 crore
Failure to notify the Board or affected Data Principals of a personal data breach | INR 200 crore
Breach of children's data obligations | INR 200 crore
Breach of Significant Data Fiduciary obligations | INR 150 crore
Residual breach of other provisions of the Act or Rules | INR 50 crore

Individual liability of Directors and Key Managerial Personnel: a board-level risk that cannot be delegated

The penalty framework under the Digital Personal Data Protection Act, 2023 attracts considerable attention because of the magnitude of the financial sanctions it authorises. However, for boards of directors and senior management, the more operationally significant question is whether individual exposure can arise at the governance level even where the statutory penalties are formally imposed on the Data Fiduciary as a corporate entity. The Act does not contain an express provision imposing direct personal penalties on directors or Key Managerial Personnel. Nevertheless, certain structural features of the framework mean that boards should approach data protection governance with the same seriousness as they would any other area of regulatory risk management.

First, the accountability architecture of the Act situates data protection governance squarely within the oversight responsibilities of the governing body of the Data Fiduciary. Section 10 requires a Significant Data Fiduciary to appoint a Data Protection Officer who is responsible to the board of directors or equivalent governing body. While this reporting structure does not itself create statutory liability for directors, it clearly embeds data protection oversight within the board's governance framework. As a result, the extent to which the organisation has implemented appropriate technical and organisational measures under section 8 will inevitably be evaluated in light of the governance mechanisms established at the senior-management and board level. For boards of entities that may be classified as Significant Data Fiduciaries, matters such as Data Protection Officer reporting lines, review of Data Protection Impact Assessments, and oversight of algorithmic or high-risk processing activities should therefore be treated as substantive governance functions rather than operational matters delegated entirely to management.

Secondly, enforcement proceedings before the Data Protection Board of India are directed against the Data Fiduciary. However, the Board's inquiry powers allow it to call for information, documents and records relevant to an alleged instance of non-compliance. In practice, this means that internal governance materials—including board minutes, compliance reports, internal communications and management approvals—may become part of the evidentiary record in a significant breach investigation. Such materials can illuminate what the organisation's leadership knew about data protection risks, when those risks were identified, and what steps were taken to address them. Even in the absence of a statutory attribution provision, the documentary record of board oversight and management decision-making may therefore assume practical significance in the regulatory assessment of organisational compliance.

Thirdly, and of particular relevance from a legal risk perspective, the DPDP framework operates alongside other Indian statutes that do contain express attribution provisions imposing personal liability on officers responsible for the conduct of a company's business. The Companies Act, 2013 includes the concept of an "officer in default," under which directors and senior officers may be held responsible where a company contravenes statutory obligations and the relevant individual was in charge of, or responsible for, the conduct of the company's affairs. Similarly, the Information Technology Act, 2000 contains provisions that can attribute liability to persons responsible for the operations of a body corporate in specified circumstances. Where the factual matrix of a data incident also engages obligations under these or other parallel legal frameworks, the question of individual responsibility may therefore arise independently of the DPDP Act's own penalty provisions.

The practical implication is that board-level engagement with data protection compliance should be demonstrable and appropriately documented. Boards would be well advised to ensure that DPDP compliance forms part of periodic governance review, that the Data Protection Officer or equivalent officer has a clear reporting channel to the board, and that material data incidents or compliance gaps are escalated with board visibility rather than managed solely within operational functions. Directors who become aware of deficiencies in data protection controls should ensure that remedial measures are formally considered and recorded. Such practices help establish a defensible governance record demonstrating that the board exercised informed oversight rather than passive delegation.

In an enforcement environment where the Data Protection Board of India is newly constituted and the contours of regulatory practice are still developing, the governance posture adopted by boards and senior management is likely to assume particular importance. While penalty notices under the Digital Personal Data Protection Act, 2023 will be addressed to the Data Fiduciary, the documented conduct of the organisation's leadership—particularly in relation to oversight, escalation and remediation—may play a meaningful role in how compliance failures are evaluated in practice.

A board-ready compliance roadmap for 2026 and 2027

The staging of the Act and Rules gives organisations a rare opportunity to build compliance in sequence. A practical programme should include at least the following.

First, lawful-basis mapping. Every material processing activity touching Indian personal data should be classified under consent, section 7 legitimate use, or exemption. Businesses should specifically revisit employee-data workflows, security monitoring, support desks, fraud prevention and government-facing processes rather than defaulting all of them to consent.

Secondly, notice and consent redesign. Product notices, website flows, app journeys and enterprise onboarding materials should be rebuilt to satisfy the statutory notice standard: plain language, itemised data description, specific purpose and accessible links for withdrawal, rights exercise and complaints.

Thirdly, processor governance. Contracts with processors should be inventoried and upgraded to include security obligations, cooperation duties, breach escalation, retention and deletion controls, and log preservation where relevant.

Fourthly, retention and erasure architecture. Organisations should define specified purposes, inactivity triggers, log-retention periods, litigation-hold exceptions and account-level erasure logic. For large consumer platforms, this is a product engineering exercise as much as a legal one.

Fifthly, children's data mapping. Businesses should identify whether any product, service or feature is child-facing or may process children's data in practice. Where relevant, they should build age-assurance logic, parental-consent mechanisms and exemption analysis under the Fourth Schedule.

Sixthly, incident-response readiness. Security and privacy teams should prepare a breach decision tree aligned to rule 7, with pre-approved templates for affected Data Principals and regulator intimation, and board-escalation thresholds that match the statutory seriousness of breach obligations.
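The breach decision tree mentioned in this step can be sketched as a small triage routine. The dual notification duty (affected Data Principals and the Board) comes from the Act as described earlier in this article; everything else, including the severity score and the board-escalation threshold, is an assumed internal governance choice for illustration, not the text of rule 7.

```python
from dataclasses import dataclass, field

@dataclass
class BreachAssessment:
    involves_personal_data: bool
    data_principals_identified: bool
    # Hypothetical internal severity score (0-10); the escalation
    # threshold below is a governance choice, not a statutory test.
    severity_score: int
    actions: list = field(default_factory=list)

def triage(b: BreachAssessment) -> list:
    """Return the ordered response actions for a suspected incident.
    Sketch only: the real workflow must be aligned to rule 7 and
    reviewed with counsel before use."""
    if not b.involves_personal_data:
        b.actions.append("log_and_close_as_security_event")
        return b.actions
    # A personal data breach triggers both regulator and user
    # notification duties under the Act, as described in the article.
    b.actions.append("notify_board")
    if b.data_principals_identified:
        b.actions.append("notify_affected_data_principals")
    # Board-of-directors escalation above an assumed severity threshold.
    if b.severity_score >= 7:
        b.actions.append("escalate_to_board_of_directors")
    return b.actions
```

The value of writing the tree down in this form is that the pre-approved templates and escalation thresholds referred to above become testable artefacts rather than tribal knowledge held by the security team.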

Seventhly, SDF preparedness. High-scale businesses, major digital platforms, ad-tech operators, fintechs, health-techs and politically sensitive intermediaries should build DPO reporting lines, internal DPIA methodology and algorithmic risk review now, rather than after notification.

Conclusion

India's DPDP framework is now legally real and operationally consequential. The final Rules answer many of the implementation questions that followed enactment of the statute, but they also make the compliance task more concrete. The Indian model is not a direct transposition of foreign data protection regimes. It is a distinct architecture built around lawful purpose, consent and certain legitimate uses, accountable fiduciaries, targeted children's-data controls, a regulated consent-management layer, an institutionally central Board, and a staggered commencement design that gives businesses time but not immunity.

For legal and compliance teams, the immediate challenge is sequencing. The correct response is not to build everything at once, nor to wait until May 2027 and compress the programme into a late implementation sprint. The right response is to use 2026 for architecture: inventory the data, classify the processing, redraft the notices, fix processor contracts, design retention logic, map child-data exposure, and establish governance for high-risk or likely-SDF processing. Organisations that do that work early will enter the main commencement phase with defensible systems. Those that do not may discover that the real risk under the DPDP regime is not only the monetary penalty, but the inability to evidence disciplined compliance when the Board starts asking operational questions.

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.
