UK Data Protection changes come into force
Whilst the Data (Use and Access) Act 2025 ("DUA Act 2025") received Royal Assent in June 2025, the most significant changes under the UK's long-awaited data protection and e-privacy "reform" come into force on 5 February 2026, marking the most meaningful shift in this area since Brexit.
The changes include:
- a shift from prohibiting automated decision-making to permitting it, subject to certain rules and safeguards;
- an increase in the maximum fine for e-privacy breaches from £500,000 to the greater of £17.5 million or 4% of global annual turnover, aligning enforcement with the UK GDPR;
- a list of "recognised" legitimate interests, for which the full balancing test is not required, although the necessity of processing for those purposes should still be recorded;
- a new "data protection test" for both controllers and the UK Government to apply when considering whether a data importer has appropriate safeguards in place to protect personal data;
- the introduction of a new complaints regime, requiring controllers to act as the first line in dealing with data protection complaints (coming into force in June 2026); and
- a codification of existing market practice.
Further details of the changes can be found in our blog post, available here.
CJEU rules that EDPB decisions can be challenged
In a key judgment handed down on 10 February 2026, the Court of Justice of the European Union ("CJEU") has ruled that companies may seek annulment of binding decisions issued by the European Data Protection Board ("EDPB") under Article 65 GDPR.
In December 2018, the Irish Data Protection Commission ("DPC") initiated an investigation into WhatsApp Ireland, a data controller, regarding compliance with transparency obligations under the GDPR. A draft decision was circulated to other supervisory authorities, but as no consensus was reached on aspects of that draft, the matter was referred to the EDPB. The EDPB's Binding Decision 1/2021 required the DPC to amend its proposed corrective measures, ultimately leading to an increased €225 million fine for WhatsApp. WhatsApp sought annulment of the EDPB decision before the General Court, which dismissed the action. WhatsApp subsequently appealed to the CJEU.
In setting aside the General Court's Order, the CJEU has confirmed that binding decisions of the EDPB may be challenged. As such, entities subject to the GDPR may be able to challenge such decisions directly before the EU courts, including where those decisions are implemented by the relevant supervisory authority. The ruling is significant for companies facing GDPR fines, which may now more readily litigate their own penalties.
Capita data claim lives to fight another day
A recent judgment in the Capita case has addressed an important procedural challenge in large‑scale data breach litigation, concerning the circumstances in which group claims may be struck out as an abuse of process. It provides clear guidance on the proper role of pleadings, the use of generic allegations of distress, and the high threshold for strike‑out applications.
The judgment concerned Capita’s application to strike out, or obtain reverse summary judgment on, claims brought by 3,973 claimants following a data incident. Capita argued that the claims were an abuse of process under CPR 3.4(2)(b) because the claimants’ solicitors had allegedly “tainted” the claimants’ evidence on distress by advancing generic assertions in the Particulars of Claim and by using a structured questionnaire.
Master Dagnall dismissed the application in full. He held that the allegation of taint was entirely speculative and that no abuse of process had been established. The court emphasised that pleadings may legitimately be expressed in lawyers’ language. The merits of the claims will be determined on the evidence at trial, not on the wording of the pleadings. The use of a generic framework for pleading distress was not impermissible where each claimant had voluntarily signed up to the claim and assented to the pleaded allegations as applying to them. The fact that claimants might not have brought individual proceedings without a group action was irrelevant.
Master Dagnall also noted that strike‑out is a draconian, last‑resort remedy. Had there been concerns about exaggeration or dishonesty, these were matters for cross‑examination, not procedural disposal. The judge was critical of the parties’ correspondence becoming focused on solicitors’ conduct rather than the substantive merits, and he rejected attempts to rely on criticisms made in other cases.
DSG Retail v ICO - Court of Appeal broadens data security obligations
On 19 February 2026, the Court of Appeal found in favour of the ICO in its challenge to the Upper Tribunal’s ruling in its long-running dispute with DSG Retail Limited.
The dispute arose after a cyber attack in which criminals accessed payment card data on DSG’s systems. The ICO had issued a £500,000 penalty under the Data Protection Act 1998 ("DPA 1998"), although the First-tier Tribunal later reduced that amount. The Upper Tribunal then went further by ruling that the stolen card numbers and expiry dates did not amount to personal data because the attackers could not use those details to identify individual customers. In its view, information of this kind becomes personal data only if it can be matched with additional identifying information held by DSG or another entity. If no such link can be made and the information remains effectively anonymous, the acquisition of that information would not amount to unauthorised processing of personal data.
The ICO argued that this analysis was flawed. It said the Tribunal had taken an overly narrow approach to the concept of personal data and had overlooked the fact that pseudonymised information is still regulated. Under the seventh data protection principle of the DPA 1998, organisations must adopt appropriate technical and organisational measures to secure personal data, even where that data has been partially masked. The ICO warned that the Tribunal’s interpretation risked weakening important protections by suggesting that security duties disappear whenever an attacker cannot re-identify individuals.
The appeal therefore focused on whether payment card information, even when obtained in an incomplete form, still falls within the scope of personal data; whether organisations remain responsible for securing it; and whether the question of whether data constituted 'personal data' is to be assessed from the perspective of the controller or from the perspective of the hacker.
The Court of Appeal agreed with the ICO. It held that the correct test is whether the controller can identify the individuals to whom the data relates. It is irrelevant that a hacker might not be able to do so. If the controller can link the information back to real people, the data retains its status as personal data. Pseudonymisation does not reduce the controller’s legal duties. This approach is consistent with European case law and reinforces that the duty to keep data secure arises from the relationship between the controller and the data subject and not from the actions of an unauthorised third party.
The decision confirms that even partial datasets, including card numbers and expiry dates, attract full security obligations where the controller is capable of identifying the individual behind the information. The matter has now been returned to the First-tier Tribunal, which will consider the appropriate penalty in light of the clarified legal position.
Digital Omnibus: EDPB and EDPS key concerns
On 11 February 2026, the European Data Protection Board ("EDPB") and the European Data Protection Supervisor ("EDPS") published the Joint Opinion 2/2026 ("Joint Opinion") on the European Commission’s Digital Omnibus Package ("Digital Omnibus").
The Digital Omnibus aims to streamline the EU data protection rulebook, reduce administrative burden and improve the competitiveness of organisations. It proposes amendments to a broad range of legislation, including the GDPR and the ePrivacy Directive. See our blog post here for further information.
The Joint Opinion assesses whether the Digital Omnibus:
- delivers genuine simplification and facilitates compliance;
- improves legal certainty; and
- preserves the protection of individuals’ fundamental rights.
While welcoming the overall objective of simplification, the EDPB and EDPS conclude that several aspects of the proposal raise material concerns. At a high level, the regulators identify three recurring risks:
- a reduction in the level of protection afforded to individuals;
- the introduction of new legal uncertainty; and
- changes that may make EU data protection legislation harder to apply in practice.
The Joint Opinion sends a clear message to EU legislators: simplification of the digital regulatory framework is welcome, but changes affecting core concepts will face close scrutiny. The Digital Omnibus is now continuing its passage through the legislative process, and so we must wait and see the extent to which it is amended to reflect the concerns raised in the Joint Opinion.
International Data Protection Authorities issue joint statement on privacy risks of AI-generated imagery
On 23 February 2026, data protection authorities around the world published a joint statement on AI-generated imagery and the importance of protecting privacy. The joint statement, which represented the shared opinions of 61 authorities, addressed the growing concerns surrounding AI-generated images depicting identifiable individuals without their knowledge or consent. Whilst the joint statement acknowledged the benefits of recent technological advancements in AI, it noted that these advances are accompanied by several risks, such as the creation of non-consensual intimate imagery, defamatory material, and other misleading or harmful depictions. These risks are heightened for children and other vulnerable groups.
The authorities outlined key principles that should underpin legislation, including the implementation of robust safeguards, clear transparency on AI capabilities and acceptable use, effective removal procedures for AI-generated images and enhanced protections for children.
The authorities stressed that the harms arising from non‑consensual or otherwise harmful AI‑generated content are significant and warrant urgent regulatory attention. The co‑signatories emphasised their commitment to coordinated international engagement, including information‑sharing on enforcement, policy development, and education, to the extent permitted by applicable laws.
The statement concluded with a call for organisations to engage proactively with regulators and to embed privacy‑protective safeguards from the outset. As AI technologies continue to evolve, the authorities state that innovation must not come at the expense of privacy, dignity, safety, and other fundamental rights—particularly those of the most vulnerable members of society.
AG Opinion discusses privacy and personal data access in competition
In an opinion delivered on 26 February 2026, the Advocate General, Athanasios Rantos, provided guidance on the scope of the European Commission’s powers to request information containing personal data in competition investigations.
This opinion was provided in the context of appeals against information‑request decisions, arising from two Commission investigations into an organisation's conduct. The Commission had required the organisation to produce a large volume of internal documents identified through electronic searches using multiple search terms applied to the accounts of senior employees over several years. The decisions were challenged on the basis that the requests were unnecessary and disproportionate, and that they infringed the right to respect for private life because they captured documents containing personal data, including so‑called “mixed documents” combining business and personal information.
The General Court dismissed the actions, and the case was appealed to the CJEU. The Advocate General's opinion proposes that both appeals be dismissed, largely endorsing the General Court's approach and confirming that:
- under Article 18 of Regulation 1/2003, the Commission may request all information it can reasonably suppose will help it determine whether a suspected infringement has taken place; and
- necessity is assessed by reference to whether the Commission could reasonably expect the information to be useful at the time of the request, not by reference to the proportion of documents ultimately found to be relevant.
He also rejected the argument that the Commission was required to demonstrate that a more targeted or less burdensome approach was available.
The AG accepted that the Commission’s processing of personal data is inherent in competition enforcement and is lawful where necessary for tasks carried out in the public interest. While disclosure of documents may interfere with the right to respect for private life under Article 7 of the Charter, such interference can be justified if it meets the conditions in Article 52(1), including proportionality.
Crucially, the AG drew a distinction between special category personal data and other personal data contained in mixed documents. Only the former requires enhanced safeguards such as a virtual data room. Mixed documents that do not contain special category personal data may still be accessed, subject to existing protections, including professional secrecy obligations binding Commission officials.
The Opinion provides reassurance to competition authorities that large‑scale electronic searches using broad terms are not inherently unlawful, even where they capture personal data. At the same time, it clarifies that EU data protection law and fundamental rights apply, but should not prevent the Commission from accessing mixed business documents where this is necessary and proportionate for enforcement purposes.
EDPB reports on consultation for EU-wide DP templates
On 12 February 2026, the European Data Protection Board ("EDPB") announced the outcome of its public consultation to identify which templates would most help organisations, including, in particular, small and medium enterprises, to ensure GDPR compliance. This consultation follows the Helsinki Statement commitment to improving clarity and support for organisations subject to the GDPR.
The report summarises the results of the consultation, which saw 82 written contributions from a range of stakeholders including business associations, public authorities, NGOs, and individuals. Based on feedback, the EDPB has confirmed that it will develop the following three templates as part of its Work Programme for 2026-2027: (i) a legitimate interest assessment template; (ii) a record of processing activities template; and (iii) a privacy notice template. Whilst the development of such templates could be seen by controllers as welcome guidance and a step towards consistency, there is likely also to be hesitancy until we see what the templates look like. If the proposed templates are overly complex or burdensome, controllers might not welcome a working assumption that they should be followed.
French Supreme Court weighs in on pseudonymisation vs anonymisation
After the CJEU judgment last year regarding the impact of pseudonymisation on personal data (see here for our blog post on that judgment), the French Supreme Court issued its own interesting ruling on the issue in February 2026.
The ruling made the following key findings:
- Pseudonymised ≠ Anonymous — Re-identification Risk Was Not Insignificant: The Conseil d’État confirmed that the data processed by the companies, although pseudonymised, constituted personal data within the meaning of the GDPR. The re-identification risk was “not insignificant” because: (i) the datasets contained rich quasi-identifiers: age, sex, pathologies, prescriptions, precise consultation dates; (ii) healthcare professional identifiers were included; and (iii) a unique patient identifier made it possible to trace care pathways and single out patients (individualiser des patients) using reasonably available means.
- Unlawful Processing of Health Data Without CNIL Authorisation: Because the data were personal (and sensitive — health data under Art. 9 GDPR), their processing without the data subjects’ consent required prior CNIL authorisation under Article 66 of the French Loi Informatique et Libertés (6 January 1978), which was never obtained.
- Illicit Automated Collection via the HRi Teleservice: Cegedim Santé’s software “Crossway” automatically downloaded data from the French social security teleservice HRi (Historique des Remboursements Individualisés) whenever a participating doctor merely consulted it, without any option for simple viewing. This was held to breach the principle of lawfulness under Article 5(1)(a) GDPR.
- Proportionate Sanctions: The fines and publicity measures (publication of the Cegedim Santé decision) were confirmed as proportionate.
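The singling-out risk identified in the court's first finding can be sketched in a few lines of Python. Everything below is invented for illustration (the salted-hash tokenisation scheme, the field names and the sample records are hypothetical, not drawn from the case); the point is only that a stable per-patient token lets any holder of the dataset group consultations and trace a care pathway, even though no name appears anywhere:

```python
import hashlib

def pseudonymise(patient_id: str, salt: str = "secret-salt") -> str:
    """Replace a direct identifier with a stable token (hypothetical scheme)."""
    return hashlib.sha256((salt + patient_id).encode()).hexdigest()[:12]

# Pseudonymised consultation records: no names, but a stable token plus
# quasi-identifiers (age, sex, pathology, consultation date).
records = [
    {"token": pseudonymise("patient-001"), "age": 47, "sex": "F",
     "pathology": "diabetes", "date": "2025-03-02"},
    {"token": pseudonymise("patient-001"), "age": 47, "sex": "F",
     "pathology": "hypertension", "date": "2025-06-18"},
    {"token": pseudonymise("patient-002"), "age": 33, "sex": "M",
     "pathology": "asthma", "date": "2025-04-11"},
]

# Singling out: because the token is stable across records, all consultations
# for one (unnamed) person can be grouped and their care pathway reconstructed.
pathways = {}
for r in records:
    pathways.setdefault(r["token"], []).append(r["pathology"])
```

Discarding the salt does not remove the risk: it is the stability of the token across records, combined with the rich quasi-identifiers, that enables linkage and, ultimately, a "not insignificant" re-identification risk.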
The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.