ARTICLE
12 December 2025

CJEU Rules On GDPR Obligations For Advertisements

Baker Botts LLP

The Court of Justice of the European Union ("CJEU"), on request for a preliminary ruling from the Court of Appeal of Cluj, Romania, has issued a Grand Chamber judgment addressing the GDPR obligations of online platforms that publish user-generated advertisements containing personal data, including sensitive data. Although the case arose in the context of an online marketplace hosting classified ads, the Court's reasoning may have wider implications for digital advertising models that involve distribution, amplification, or monetization of user-provided content, particularly where profiling or targeting risks the processing—or inference—of special categories of data.

At its core, the Court's judgment reaffirms that the GDPR applies in full to platform operators that influence the purposes and means of publishing personal data in advertisements. The Court also rejects any suggestion that the e-Commerce Directive's liability exemptions for hosting providers dilute or displace GDPR duties. Critically, where sensitive data are at stake, the platform must design and operate its service to prevent unlawful publication ex ante, and it must adopt robust security measures to prevent downstream copying and redistribution.

The Judgment

The case concerned an online marketplace on which an anonymous user posted an advertisement containing photographs and contact details of an individual, falsely presenting her as offering sexual services. According to the underlying complaint, the advertisement was false, harmful, and published without her consent.

The Court first held that the marketplace operator qualifies as a controller for the publication of personal data in ads on its platform where it publishes and can exploit those ads for its own commercial or advertising purposes, determines parameters for dissemination and presentation, and reserves broad rights over content. In such circumstances, the operator and the user posting the ad are joint controllers of the publication operation under the GDPR.

Next, the Court held that, where advertisements may contain special-category data within the meaning of GDPR Article 9(1), the operator must, before publication and by design, implement appropriate technical and organizational measures to:

  • identify ads that contain sensitive data;
  • verify whether the advertiser is the data subject whose sensitive data are in the ad; and
  • refuse publication if the advertiser is not the data subject, unless the advertiser can demonstrate explicit consent of the data subject or another (limited) Article 9(2) exception applies.

Separately, under GDPR Article 32, the operator must implement appropriate security measures to prevent ads containing sensitive data from being copied and unlawfully republished on other sites. While GDPR does not require elimination of all risk, controllers must adopt measures calibrated to the state of the art, the sensitivity of the data, and the risk of loss of control. The e-Commerce Directive's separate liability regime (Articles 12–15) does not displace or reduce GDPR obligations. A platform cannot invoke hosting safe harbors to avoid GDPR compliance or accountability.

Key GDPR Takeaways

The Court's analysis is built on three GDPR pillars:

Controllership and joint controllership. A platform is a controller where it exerts influence over purposes and means of publication beyond mere technical hosting—for example, by defining dissemination parameters, organizing presentation and classification, or reserving broad exploitation rights over ad content. The user posting the ad may also be a controller of the same operation, resulting in joint controllership with responsibilities allocated transparently.

Special categories of data (GDPR Article 9). The Court emphasizes a broad conception of sensitive data that includes information revealing such data by inference. Publication of sensitive data is prohibited absent an Article 9(2) exception—typically explicit consent by the data subject to publication. Platforms must therefore build and operate pre-publication controls that detect sensitive-data content, verify the advertiser's identity, and gate publication upon proof of explicit consent (or another narrow exception). Allowing anonymous publication materially heightens risk and is incompatible with accountability where sensitive data are foreseeable.

Security of processing (GDPR Article 32). Once sensitive data are published online, the risk that the data subject loses control over them increases significantly. To address (and ideally reduce) this risk, controllers must implement technical and organizational measures that are apt, given the state of the art, to prevent copying and unauthorized dissemination. The mere fact of unlawful downstream republication is not strict proof of a breach, but controllers must be able to demonstrate that their measures were appropriate to the risks.

Implications for Targeted Advertising and Adtech Practices

While the facts of the underlying case centered on user-posted classifieds, the Court's reasoning applies wherever a service operator participates in determining the purposes and means of disseminating personal data in advertising, and especially where sensitive data may be processed or revealed by inference. This has several practical implications for targeted advertising within the EU:

Controller status for publishers and platforms. Publishers, marketplaces, social networks, and ad-funded services that organize, amplify, or monetize ads are likely controllers for the publication layer, not passive hosts. If they set dissemination parameters, structure content, or reserve exploitation rights, they should expect joint controllership at minimum with the ad originator for the publication operation, with corresponding accountability duties.

Sensitive-data inferences in targeting. Many targeting strategies (interests, lookalikes, audience segments, contextual categories) can reveal, or be used to infer, special-category data—for example, a user's sexual orientation, health status, political views, or religious beliefs. The Court's broad approach to "revealing" sensitive data underscores the risk that targeted advertising may trigger Article 9 even absent explicit user-provided fields. Where targeting reveals or hinges on sensitive data, explicit consent from the data subject specific to that purpose is generally required, unless another narrow Article 9(2) ground applies.

Pre-publication and pre-targeting controls by design. The obligation to identify sensitive data and verify identity/consent before publication translates, in targeting contexts, into upstream product and workflow controls. Platforms should design screening to flag sensitive ad content and categories; impose advertiser identity verification; require and validate explicit consent where sensitive data are implicated; and deny dissemination where proof is lacking. "By default" configurations should avoid exposing personal data to indefinite audiences absent a lawful basis.

Legitimate interests and contract theories. The decision reinforces that lawful bases must be interpreted restrictively and that Article 9 data demands an exception independent of general Article 6 bases. Profiling or targeting that reveals special-category data is unlikely to be justified by legitimate interests, nor by performance of a contract absent strict necessity. Where sensitive data are foreseeable in targeting or audience segmentation, explicit consent tailored to that purpose will typically be required.

Security measures for downstream dissemination. Controllers must adopt measures suited to the state of the art to deter copying and unauthorized redistribution of sensitive-data advertising content. While this case focuses on creative content in ads, the same logic applies to ad metadata and audience signals revealing sensitive data. Controllers should assess and document preventive measures, considering watermarking, rate-limiting, access controls, contractual and technical restrictions in syndication, and rapid takedown mechanisms.

Safe-harbor limits. Hosting safe harbors under the e-Commerce Directive do not mitigate GDPR duties. Platforms cannot rely on the absence of a general monitoring obligation to avoid implementing targeted, risk-based controls required by GDPR, particularly for sensitive data and by-design obligations.

Next Steps for Organizations

Controllers involved in advertising publication, targeting, or distribution within the EU should reassess their GDPR compliance posture with the Court's reasoning in mind. In practical terms, this means:

  • Mapping controllership and joint controllership at the publication and dissemination layers, with transparent allocation of responsibilities and contact points.
  • Reviewing targeting taxonomies, audience segments, and contextual categories for risk of revealing or inferring sensitive data; where such risk exists, implementing explicit-consent pathways and gating dissemination accordingly.
  • Implementing by-design pre-publication detection and verification workflows for sensitive ad content, including identity verification of advertisers and verification of explicit consent where the advertiser is not the data subject.
  • Enhancing security controls to prevent copying and unlawful redistribution of ad content and signals that reveal sensitive data, calibrated to state of the art and documented through DPIAs and risk assessments.
  • Revisiting reliance on legitimate interests or contractual necessity for ad personalization where sensitive-data inferences may occur, pivoting to consent where required.

The decision signals a clear expectation that platforms and publishers engaged in the dissemination and targeting of ads (whether those ads are created internally or externally) must proactively prevent unlawful processing before it occurs, particularly when there is a foreseeable risk that ads or audience strategies will disclose or infer special-category data. Accountability, by design and by default, is not optional—and safe harbors for hosting do not narrow separate GDPR obligations.

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.

