Less red tape and a simpler path to compliance - particularly for smaller businesses. Fostering innovation, AI, and data-driven growth, while not compromising fundamental rights. These are the European Commission's aims in proposing a set of sweeping reforms under the "Digital Omnibus" on 19 November 2025: two draft regulations (one for AI, one for data), along with a suite of other measures to consolidate and simplify the EU's digital rulebook. This briefing looks at how the proposed changes to the AI Act, the GDPR, Data Act, e-privacy rules, and cyber incident reporting would impact businesses.
1 Why the change?
The last 3 to 4 years have seen a rapidly expanding body of digital regulation from the EU - much of it ground-breaking but sometimes unclear in its interplay, difficult to implement in practice, and often subject to aggressive timescales. This complexity can act as a brake on growth and innovation, as well as adding significant compliance costs and uncertainty, particularly when national authorities interpret EU law inconsistently.
The Digital Omnibus is the Commission's answer. It projects annual savings of at least €1 billion, a further €1 billion in one-off cost reductions, and cumulative savings of at least €5 billion by 2029.
2 Impact on the AI Act
The key proposed changes to the AI Act of interest to businesses include:
- Delay to the August 2026 deadline for high-risk systems. The implementation timetable for the EU AI Act has so far been aggressive, particularly as formal guidance has been delayed and has only emerged shortly before deadlines. In recognition of this, the Commission proposes to link the deadline for complying with high-risk AI system obligations to the availability of Commission guidance and harmonised standards. Once the guidance becomes available, the rules for high-risk AI systems would start to apply after a transition period: six months for systems falling within Annex III (e.g. certain biometric identification, employment and credit-scoring use cases) and twelve months for new Annex I safety systems (e.g. critical infrastructure, medical devices). There would be backstop deadlines of 2 December 2027 for Annex III systems and 2 August 2028 for Annex I systems.
- Relaxation of the AI literacy requirement so that it ceases to be an obligation on providers and deployers and becomes a requirement only on the Commission and member states to "encourage" AI literacy. Training obligations would only remain for those operating high-risk AI systems.
- Relaxations for SMEs extended to small mid-caps (SMCs), e.g. simplified technical documentation, proportionate penalties, and easier quality management systems.
- Exemption from the registration requirement for AI systems used in high-risk areas for narrow or procedural tasks.
3 Impact on GDPR and e-Privacy
Streamlined incident reporting
One of the most welcome changes would be the creation of a single breach and incident reporting channel, which applies the "report once, share many" model across GDPR, the NIS2 Directive, DORA, and the Critical Entities Resilience Directive. Companies would be able to notify relevant authorities via one entry point, using a single form, reducing the administrative burden and cost of multiple filings.
The threshold for reporting would also be raised - only incidents posing a high risk to individuals would need to be notified, and the timeframe for reporting would be extended to 96 hours.
A definition of "personal data" that's easier to apply in practice
An aspect of GDPR that can make the rules particularly tough to apply in practice is an "objective"/"absolute" interpretation of the definition of "personal data". According to this interpretation, even if a controller cannot itself identify a data subject from the information it processes, the data must still be treated as personal data if anyone else is capable of re-identifying the individual with reasonable means.
The UK's Information Commissioner's Office and recent decisions of the Court of Justice of the EU have already started to move away from this absolute interpretation. The Commission proposes to codify this shift, so that the information would only be personal data for an entity if, taking into account all means reasonably likely to be used, that particular entity can identify the individual from the data, either alone or in combination with other accessible information.
For example, under the new proposal:
If Company A receives a dataset where the names are replaced by unique codes and does not possess (nor reasonably could obtain) the mapping between codes and individuals, it does not need to treat this data as "personal data" under GDPR. If Company B - perhaps the originator - holds the key and could easily re-identify the individual, it remains personal data for Company B, but not for Company A.
This reduces compliance pressures for businesses sharing or holding de-identified data and, for example, makes it easier to train AI on de-identified data.
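The Company A/Company B scenario above can be sketched in code. The following is an illustrative Python example (the function and field names are our own, not drawn from any legal text or standard library for GDPR compliance): the originator replaces names with random codes and retains the code-to-name key, so the shared dataset is re-identifiable only by the party holding the key.

```python
import secrets

def pseudonymise(records):
    """Replace names with random codes.

    Returns the coded records (to be shared) and the key
    (code -> name mapping), which stays with the originator.
    """
    key = {}
    coded = []
    for record in records:
        code = secrets.token_hex(8)  # random, non-derivable identifier
        key[code] = record["name"]
        coded.append({"id": code, "purchase": record["purchase"]})
    return coded, key

# Company B (the originator) pseudonymises before sharing:
records = [{"name": "Alice", "purchase": 120},
           {"name": "Bob", "purchase": 80}]
shared, key = pseudonymise(records)

# Company A receives only `shared` and, lacking `key`, cannot map the
# codes back to individuals. Company B retains `key` and can
# re-identify, so the data would remain personal data for B but,
# under the proposal, not for A.
print(shared)
```

Under the proposed "relative" test, what matters is each entity's own reasonable means of identification, so the same dataset can be personal data in one party's hands and anonymous in another's.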
Some relief from weaponised DSARs
Businesses have long been frustrated by data subject access requests (DSARs) used for reasons unrelated to data protection, such as gaining leverage in employment disputes. They are therefore likely to welcome the Commission's proposal to allow businesses to refuse, or charge for, DSARs that are manifestly excessive or abusive in purpose, putting them in a stronger position to fend off vexatious requests.
Simpler cookie and privacy consent compliance
The Omnibus seeks to overhaul the approach to cookies and tracking technologies. Consent would not be needed for aggregated audience measurement or security purposes. Where personal data are involved, the GDPR would exclusively govern, and all lawful grounds, including legitimate interests, would be available as a basis for subsequent processing.
Ultimately (2 to 4 years after the Regulation enters into force), the Commission wants to phase out intrusive cookie pop-ups, moving to browser settings that signal users' preferences automatically.
More flexibility around special category data
The Omnibus proposes two new derogations for processing special category data:
- Use of biometric data for identity verification when the data or means needed for verification are under the sole control of the data subject. This allows for compliant on-device authentication and similar applications.
- The use of special category data in AI development/operation, subject to certain conditions. These conditions require the implementation of appropriate technical and organisational measures to minimise and remove such data. If the data's removal would require disproportionate effort, strong technical controls are required to prevent the data forming part of outputs or otherwise being disclosed.
...plus, an express legitimate interest legal basis to make AI model training and operation (a little) easier
Using personal data for training and operating models would be explicitly recognised as a legitimate interest. However, it would still be subject to a balancing test and appropriate technical safeguards to minimise data, prevent disclosure of residually stored data and enable the data subject to object to processing.
Standardised Data Protection Impact Assessments (DPIAs)
The European Data Protection Board (EDPB) would provide EU-wide lists for when DPIAs are and are not required, along with a common methodology and template. This would help to bring clarity and uniformity to risk assessment obligations.
4 Impact on the Data Act
While there are proposed changes to the Data Act, they don't affect core data sharing obligations. As such, they are unlikely to materially ease most businesses' day-to-day compliance efforts (mid-cap companies aside, as they would benefit from relaxations that currently apply only to SMEs).
That said, the rules requiring disclosure of trade secrets are particularly problematic for some businesses. In addition to the existing ground for refusing disclosure of trade secrets where it would cause serious economic damage to a business, the Commission proposes to let businesses refuse to share data revealing trade secrets where there's a high risk of transfer to third countries with weak protections (though businesses must still notify the user and the authorities to benefit from this carve-out).
Business-to-government (B2G) data access would be limited to public emergencies, instead of the vaguer criteria of "exceptional need".
For businesses providing cloud services, the regime for switching between services would be eased for customised services, and for those provided by SMEs and mid-caps, in relation to contracts pre-dating September 2025.
5 What Next?
These proposals are in draft only, and there's a long journey ahead before they become law: agreement must be reached with the European Parliament and the Council of the EU. It is a journey that looks likely to be tumultuous - privacy lobbyists are unsurprisingly unimpressed, accusing the Commission of bowing to pressure from the US.
Meanwhile, the Commission is already looking at further simplification efforts and is undertaking a 'stress-test' of the digital rules, also known as the "Digital Fitness Check", to study the interplay between the different rules and their cumulative impact on businesses.
The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.