2 October 2025

Fasken's Noteworthy News: Privacy & Cybersecurity In Canada, The US, And The EU (September 2025)


Privacy & Cybersecurity in Canada, the US and the EU

This is a monthly bulletin published by the Privacy and Cybersecurity Group at Fasken with noteworthy news and updates. If you have any questions about the items in this bulletin, please contact any member of the Privacy and Cybersecurity Group and we will be pleased to assist.

Canada

Ontario Privacy Commissioner Imposes First-Ever Administrative Monetary Penalty

The Office of the Information and Privacy Commissioner of Ontario has issued the first-ever administrative monetary penalties (AMPs) under the province's health privacy law (PHIPA), and the first AMPs ever imposed by a privacy commissioner in Canada. In PHIPA Decision 298, the commissioner imposed a $5,000 AMP against a physician for improperly accessing patient records through a shared hospital electronic health record system and using them to identify newborn males to solicit parents for circumcision services. The commissioner also imposed a $7,500 AMP against the physician's private clinic for failing to meet its basic obligations under PHIPA. The investigation was triggered by a breach report filed by the hospitals. The commissioner issued a number of recommendations, emphasizing the importance of having robust privacy and information management practices in place.

Ontario Privacy Commissioner Issues a Complaint Report Against a University for Its Use of Face Detection Technology

In complaint report PX24-00001, the Ontario privacy commissioner found that a university had not complied with the province's public sector privacy law (FIPPA) by installing "smart" snack vending machines that used face detection camera technology without notifying users or obtaining proper consent. Although the university had some contractual safeguards in place with the technology vendor, it was unaware that the machines' face detection technology was collecting personal information. This gap stemmed largely from deficiencies in procurement, including the failure to conduct a privacy impact assessment or to require such disclosures from vendors.

The commissioner recommended that the university review its privacy policies to ensure that any future collection of personal information complies with FIPPA, and take adequate steps in its procurement process to ensure it evaluates third-party service providers and any new technology to be used.

Federal Privacy Commissioner Guidance for Processing Biometrics

On August 11, 2025, the Office of the Privacy Commissioner of Canada released guidance on processing biometrics for private sector businesses and federal institutions. The guidance for businesses is structured around the key principles found in the Personal Information Protection and Electronic Documents Act (e.g., identifying appropriate purposes, obtaining consent, and limiting collection, use, disclosure, and retention), and outlines, for each principle, what businesses must and should do in respect of their handling of biometric information. The guidance for federal institutions is similarly structured around key principles set out in the federal Privacy Act.

In both cases, the guidance provides technical definitions related to biometric technology and common use cases (e.g., physiological and behavioural biometrics, which may be used for verification, identification, or classification purposes). Businesses and institutions that handle biometric information should be aware that biometric information is sensitive when it can uniquely identify an individual, when its misuse poses a high risk of harm to individuals, or when it reveals other sensitive information such as medical information. These concerns apply regardless of context, because biometric information is stable over time, difficult to change, and innately linked to an individual's identity.

Europe

The EU General Court Dismisses an Action for Annulment of the New Framework for the Transfer of Personal Data Between the European Union and the United States

The General Court dismissed the action and confirmed that, as of the contested decision's adoption date, the United States provided adequate protection for personal data transferred from the EU to organizations there, for the following reasons (the decision is available in French only):

  • The Data Protection Review Court (DPRC) is impartial and independent: the appointment of judges to the DPRC and the DPRC's functioning are accompanied by several safeguards to ensure the independence of its members. Moreover, judges of the DPRC may be dismissed only by the Attorney General and only for cause, and the Attorney General and intelligence agencies may not hinder or improperly influence their work.
  • The bulk collection of personal data by American intelligence agencies cannot be considered to fall short of the requirements arising from Schrems II, nor to fail to ensure a level of legal protection essentially equivalent to that guaranteed by EU law.

The Court of Justice Clarifies the Scope of the Concept of Personal Data in the Context of a Transfer of Pseudonymized Data to Third Parties

In its decision dated September 4, 2025, C‑413/23 P, the Court held that pseudonymized data should not be regarded as personal data in all circumstances and for all persons: depending on the circumstances of the case, pseudonymization may effectively prevent persons other than the controller from identifying the data subject, so that, for them, the data subject is not or is no longer identifiable.

First, the Court notes that pseudonymization is not an element of the definition of "personal data," but refers to the implementation of technical and organizational measures to reduce the risk of correlating a set of data with the identity of the data subjects.

The Court then emphasizes that one of the main purposes of pseudonymization is to prevent the data subject from being identified solely on the basis of the pseudonymized data. Where the technical and organizational measures required for pseudonymization are effectively implemented and prevent the data from being attributed to the data subject, so that the data subject is not or is no longer identifiable, pseudonymization may affect the personal nature of the data. In this regard, the Court specifies that the controller who carried out the pseudonymization retains additional information enabling the data transmitted to the processor to be attributed to the data subject; for the controller, the information therefore remains personal. For the processor, by contrast, the technical and organizational measures may mean that the information is not personal in nature, provided the processor is unable to re-identify the data subject.

A Draft Adequacy Decision for Brazil

On September 5, 2025, the European Commission published a draft adequacy decision finding that Brazil ensures an adequate level of data protection, comparable to that of the EU.

Once adopted, the decision would allow for free data flows for businesses, public authorities, and research projects between the EU and Brazil, without the need to carry out a transfer risk assessment or enter into standard contractual clauses. The Brazilian authorities have also initiated a process to adopt an equivalent decision to allow for Brazilian data to flow freely to the EU.

Guidelines 3/2025 on the Interplay Between the DSA and the GDPR

During its September plenary meeting, the European Data Protection Board (EDPB) adopted guidelines on the interplay between the Digital Services Act (DSA) and the General Data Protection Regulation (GDPR). The DSA aims to complement the rules of the GDPR to ensure the highest level of protection of fundamental rights in the digital space. Its main goal is to create a safer online environment in which the fundamental rights of all users, including minors, are protected. It applies to online intermediary services, such as very large online platforms (VLOPs) and very large online search engines (VLOSEs). Several provisions included in the DSA entail the processing of personal data by intermediary service providers and there are a number of provisions which relate to the GDPR, such as rules that refer to 'profiling' and 'special categories of data'.

The EDPB guidelines help clarify how the GDPR should be applied in the context of DSA obligations.

The guidelines will be subject to public consultation, providing stakeholders with the opportunity to comment and provide feedback.

Commission Publishes Frequently Asked Questions About the Data Act

The Commission's Frequently Asked Questions (FAQs) on the Data Act help organizations implement it. As a reminder, the Data Act establishes a horizontal set of rules on data access and use that respects the protection of fundamental rights and delivers wide-ranging benefits for the European economy and society. It increases data availability, particularly of industrial data, and encourages data-driven innovation while ensuring fairness in the allocation of data value among all actors in the data economy.

United States

California Advances Bill to Regulate Companion Chatbots

With bipartisan support, the California legislature approved a bill that would place new safeguards on artificial intelligence-powered chatbots to better protect children and other vulnerable users. Senate Bill 243 passed both the State Assembly and Senate and now heads to the state Governor's desk.

The bill would require companies that operate chatbots marketed as "companions" (AI systems that provide adaptive, human-like responses and are capable of meeting a user's social needs) to avoid exposing minors to sexually explicit content or to content involving suicidal ideation or self-harm. The bill would also require platforms to provide recurring alerts reminding users that they are speaking to an AI chatbot, not a real person, and that they should take a break, and it would establish annual reporting and transparency requirements. In addition, the bill would let individuals sue AI companies for alleged violations, seeking injunctions, damages of up to $1,000 per violation, and attorney's fees.

If enacted, the bill would make California the first state to require AI chatbot operators to follow safety protocols and hold companies legally accountable for non-compliance, with core requirements taking effect on January 1, 2026 and additional reporting obligations beginning July 1, 2027.

FTC Launches Inquiry into AI Chatbots Acting as Companions

The Federal Trade Commission is issuing orders to seven companies that provide consumer-facing AI-powered chatbots, seeking information on how these firms measure, test, and monitor the potentially negative impacts of this technology on children and teens.

The FTC inquiry seeks to understand what steps, if any, companies have taken to evaluate the safety of their chatbots when acting as companions, to limit the products' use by and potential negative effects on children and teens, and to apprise users and parents of the risks associated with the products.

In case you missed it!

The Fasken Privacy and Cybersecurity group recently shared the following thought leadership, which may be of interest.

Congrats to our Fasken team!

We are pleased to announce that 11 Fasken lawyers have been recognized in the 2026 edition of The Best Lawyers in Canada for their work in Privacy and Data Security Law.

View the full list of recognized lawyers.

Best Lawyers is a well-established peer-review publication that has identified leading legal professionals across jurisdictions and practice areas for over 30 years.

We are also proud to share that Fasken has been ranked Band 2 in the Chambers Canada 2026 Guide for Privacy and Data Protection.

Four of our lawyers – Alex Cameron, Daniel Fabiano, Soleïca Monnier, and Samantha Delechantos – have also been individually recognized for their outstanding work in this area.

View the full Chambers Canada rankings.

Chambers is a leading legal directory that ranks law firms and lawyers based on in-depth research and client feedback across practice areas and jurisdictions.

Where you will find us

Members of our Privacy and Cybersecurity group will be speaking at or attending the following events in the coming months. Keep an eye out for our team and stop by to say hi!

  • The implementation of Law 25 significantly overhauls the privacy obligations of organizations. Our colleague Soleïca Monnier, a lawyer in our Privacy and Cybersecurity group, presents an online training session on the illuxi platform, offering clear and useful insights into:
    • new compliance requirements
    • the life cycle of personal information
    • managing privacy incidents

Why participate?

Because this training meets an urgent need: teaching employees about Law 25 in a clear, timely and structured manner.

This online training, available in English and French, is supported by an LMS (Learning Management System) platform that tracks employee progress, records participation statistics and generates proof of certification.

In an environment where regulatory requirements are becoming more stringent and organizations must demonstrate compliance, this solution serves as a strategic tool to facilitate and accelerate the process.

Discover this training.

About Fasken's Privacy and Cybersecurity Group

As one of the longest-standing and leading practices in privacy and cybersecurity, our dedicated national privacy team of over 30 lawyers offers a wide range of services. From managing complex privacy issues and data breaches to advising on the EU General Data Protection Regulation and emerging legal regimes, we provide comprehensive legal advisory services and are trusted by top cyber-insurance carriers and Fortune 500 companies. Our group is recognized as a leader in the field, earning accolades such as the PICCASO 'Privacy Team of the Year' award and recognition from Chambers Canada and Best Lawyers in Canada. For more information, please visit our website.

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.
