On 13 February 2026, the French Conseil d'État rejected the appeals filed by the companies GERS, Santestat (absorbed by GERS) and Cegedim Santé — all part of the Cegedim group — against CNIL sanctions of €800,000 (GERS), €200,000 (Santestat) and €800,000 (Cegedim Santé) respectively.
The case arose from the above series of sanctions imposed by the CNIL in 2024 following extensive investigations into the group's processing of health‑related datasets. The Cegedim entities involved operated two large databases used to generate statistical insights for the healthcare sector. Cegedim maintained that the data in question had been rendered anonymous through pseudonymisation techniques.
The key findings of the court can be summarised as follows:
- Pseudonymised ≠ Anonymous — Re-identification Risk Was Not Insignificant: The Conseil d'État confirmed that the data processed by these companies, although pseudonymised, constituted personal data within the meaning of the GDPR. The re-identification risk was "not insignificant" because: (i) the datasets contained rich quasi-identifiers: age, sex, pathologies, prescriptions, precise consultation dates; (ii) healthcare professional identifiers were included; and (iii) a unique patient identifier made it possible to trace care pathways and single out patients (individualiser des patients) using reasonably available means.
- Unlawful Processing of Health Data Without CNIL Authorisation: Because the data were personal (and sensitive — health data under Art. 9 GDPR), their processing without the data subjects' consent required prior CNIL authorisation under Article 66 of the French Loi Informatique et Libertés (6 January 1978), which was never obtained.
- Illicit Automated Collection via the HRi Teleservice: Cegedim Santé's software "Crossway" automatically downloaded data from the French social security teleservice HRi (Historique des Remboursements Individualisés) whenever a participating doctor merely consulted it, without any option for simple viewing. This was held to breach the principle of lawfulness under Article 5(1)(a) GDPR.
- Proportionate Sanctions: The fines and publicity measures (publication of the Cegedim Santé decision) were confirmed as proportionate.
The rest of this blog post looks in more detail at the differences between this judgment and the recent CJEU judgment on pseudonymisation, as well as the practical implications for businesses and corporate transactions.
Conseil d'État vs the Court of Justice of the European Union
In its landmark judgment of 4 September 2025 (EDPS v. SRB), the Court of Justice of the European Union ("CJEU") established a relative, recipient-perspective approach to the qualification of pseudonymised data. It found that:
- Pseudonymised data is NOT automatically personal data in all cases and for every person.
- The same dataset may be personal data for the transferor (who holds the re-identification key) but anonymous data for a recipient who has no reasonable means of re-identifying individuals.
- Any assessment must consider "all the means reasonably likely to be used" to re-identify — legal barriers, contractual restrictions, technical measures.
- However, the transferor's GDPR obligations persist: transparency requirements (Art. 13/14 GDPR) must be assessed from the controller's perspective at the time of collection, not the recipient's. Similarly, the privacy notice must disclose recipients, even if the data will be anonymous in their hands.
At first glance, the CJEU SRB ruling and the Conseil d'État GERS decision might appear to pull in opposite directions. In reality, they are complementary, as demonstrated in the following table:
| | CJEU, SRB (Sept. 2025) | Conseil d'État, GERS (Feb. 2026) |
| --- | --- | --- |
| Core question | Can pseudonymised data be anonymous for the recipient? | Are these specific pseudonymised datasets anonymous? |
| Test applied | Relative — perspective of the recipient | Relative — but applying the "reasonably available means" test rigorously |
| Outcome | Yes, pseudonymised data can fall outside GDPR for the recipient if re-identification is not reasonably likely | No, these specific datasets remained personal data because re-identification risk was not insignificant |
| Why | SRB shared alphanumeric codes with Deloitte, which had no key, no access, no legal means to re-identify | GERS/Cegedim held rich quasi-identifiers (care pathways, prescriptions, practitioner IDs, precise dates) enabling singling-out |
Key Takeaway: The CJEU SRB Ruling Is Not a Licence to Pseudonymise and Forget
The Conseil d'Etat GERS decision demonstrates in concrete terms what the CJEU's "relative" test looks like when applied rigorously:
- Depth and granularity of data matter enormously. Rich health datasets with multiple quasi-identifiers (age + pathology + prescription + date + practitioner) create combinatorial re-identification risk that mere pseudonymisation cannot neutralise.
- The "unique identifier" trap. A single pseudonymous patient ID that links all records over time enables the reconstruction of full care pathways — this is singling out, one of the three criteria from the Article 29 Working Party's Opinion 05/2014 on anonymisation techniques (alongside linkability and inference).
- The "means reasonably likely" test is demanding. The Conseil d'État applied the SRB framework but found that the richness of the data, combined with the possibility of cross-referencing with external datasets, made re-identification reasonably likely.
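The singling-out risk described above can be illustrated with a minimal sketch (all records, field names and values below are hypothetical, not drawn from the GERS case): even after the direct identifier is replaced by a code, a combination of quasi-identifiers that is unique in the dataset isolates one record — in anonymisation terms, k-anonymity with k = 1.

```python
from collections import Counter

# Hypothetical pseudonymised records: the direct identifier is replaced by
# a code ("pid"), but rich quasi-identifiers remain.
records = [
    {"pid": "A1", "age": 47, "sex": "F", "pathology": "diabetes", "visit": "2023-03-01"},
    {"pid": "B2", "age": 47, "sex": "F", "pathology": "diabetes", "visit": "2023-03-01"},
    {"pid": "C3", "age": 62, "sex": "M", "pathology": "asthma",   "visit": "2023-05-12"},
]

def singled_out(rows, quasi_identifiers):
    """Return records whose quasi-identifier combination is unique (k = 1),
    i.e. records that can be singled out despite pseudonymisation."""
    combos = Counter(tuple(r[q] for q in quasi_identifiers) for r in rows)
    return [r for r in rows if combos[tuple(r[q] for q in quasi_identifiers)] == 1]

qi = ["age", "sex", "pathology", "visit"]
unique = singled_out(records, qi)
# The third record is the only one with its combination of age, sex,
# pathology and visit date, so it can be singled out even though "pid"
# reveals nothing by itself.
```

The richer the quasi-identifiers (and the longer the linked care pathway per patient ID), the more combinations become unique, which is precisely the combinatorial risk the Conseil d'État found decisive.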
Practical implications – Use Cases
1. M&A Transactions and Data Rooms
In M&A due diligence, personal data is routinely shared in virtual data rooms (VDRs). The GERS/SRB tandem provides critical guidance:
- Seller's obligations persist: Even if pseudonymised data is shared with potential buyers (who may not hold the re-identification key), the seller remains a controller and must comply with GDPR — privacy notices, DPIA, legal basis for sharing.
- The "anonymous in the buyer's hands" argument is fragile if the data room contains rich datasets (employee data with job title + department + seniority; customer data with purchase history + location + demographics). Cross-referencing with publicly available information (LinkedIn, company registries) may make re-identification reasonably likely.
- Actionable step: Conduct a documented re-identification risk assessment before populating the data room. Consider genuine aggregation or statistical summaries rather than pseudonymised individual records. Include contractual anti-re-identification clauses in NDA/data room rules.
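The "genuine aggregation" alternative mentioned above can be sketched as follows (a minimal illustration with hypothetical data and an assumed minimum cell size, not a legal benchmark): instead of placing pseudonymised individual rows in the data room, disclose only grouped statistics and suppress groups too small to hide an individual.

```python
from collections import defaultdict

# Hypothetical employee rows that might otherwise end up in a data room.
employees = [
    {"dept": "Sales", "salary": 52000},
    {"dept": "Sales", "salary": 61000},
    {"dept": "Sales", "salary": 58000},
    {"dept": "Legal", "salary": 70000},
]

MIN_CELL = 3  # assumed suppression threshold; pick based on your risk assessment

def aggregate(rows):
    """Group rows by department and return headcount and average salary,
    suppressing any group smaller than MIN_CELL."""
    buckets = defaultdict(list)
    for r in rows:
        buckets[r["dept"]].append(r["salary"])
    return {
        dept: {"headcount": len(s), "avg_salary": sum(s) / len(s)}
        for dept, s in buckets.items()
        if len(s) >= MIN_CELL  # "Legal" (one person) is suppressed, not disclosed
    }

summary = aggregate(employees)
```

A buyer receiving only such summaries gets the commercially relevant picture while the re-identification surface is far smaller than with pseudonymised row-level data.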
2. Health Data and Clinical Research
The GERS ruling is a direct warning for the medtech and pharma sectors:
- Observatories and RWE (Real-World Evidence) databases that rely on pseudonymised patient data from practitioners must undergo a rigorous anonymisation assessment — or secure a lawful route under the CNIL framework (compliance with a CNIL référentiel or reference methodology (MR), or a specific authorisation).
- The EHDS (European Health Data Space) regulation will not change this: secondary use of health data will still require appropriate safeguards.
3. AI Training and Data Sharing Agreements
Companies sharing pseudonymised datasets for AI/ML training must assess whether the receiving party can reasonably re-identify individuals — factoring in the recipient's own datasets, technical capabilities, and legal access.
The SRB ruling provides a pathway: if properly pseudonymised and the recipient genuinely lacks re-identification means, the data may be outside GDPR scope for the recipient. But GERS shows this pathway has high evidentiary requirements.
4. International Data Transfers
The SRB relative approach could reduce GDPR friction for cross-border transfers of properly pseudonymised data to non-EU recipients — if the recipient cannot reasonably re-identify, the data may not be "personal data" triggering Chapter V transfer restrictions.
But again, the transferor's obligations remain (transparency, DPIA, documentation, etc.).
5. Intra-Group Data Sharing
Group companies sharing pseudonymised client/customer data between entities (e.g., for analytics, product development, loyalty programmes) cannot assume the data is "anonymous" simply because identifiers have been replaced — especially if multiple group entities can pool their datasets to reconstruct individual profiles.
The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.