ARTICLE
28 April 2026

Global Data Protection Insights

K&L Gates LLP


At K&L Gates, we foster an inclusive and collaborative environment across our fully integrated global platform that enables us to diligently combine the knowledge and expertise of our lawyers and policy professionals to create teams that provide exceptional client solutions. With offices worldwide, we represent leading global corporations in every major industry, capital markets participants, and ambitious middle-market and emerging growth companies. Our lawyers also serve public sector entities, educational institutions, philanthropic organizations, and individuals. We are leaders in legal issues related to industries critical to the economies of both the developed and developing worlds—including technology, manufacturing, financial services, healthcare, energy, and more.

OVERVIEW

Staying ahead of data protection developments across multiple jurisdictions is one of the most demanding challenges facing businesses today. Our Global Data Protection Insights newsletter distills the most important regulatory, enforcement, and litigation developments from Australia, Europe, China, the United States, and beyond into one concise, practitioner-authored resource. Whether you are navigating new HIPAA cybersecurity requirements, children's privacy obligations, or cross-border data transfer rules, this newsletter gives you the clarity and context to act with confidence.

INDUSTRY FOCUS

US HEALTHCARE AND CYBERSECURITY

By: Sarah L. Carlins, Martin A. Folliard, Clarita I. Sullivan

Important changes to the United States' federal standards governing the cybersecurity of patient health information may be coming soon. In December 2024, the US Office for Civil Rights (OCR) at the Department of Health and Human Services (HHS), the agency that oversees the Health Insurance Portability and Accountability Act (HIPAA), proposed the HIPAA Security Rule to Strengthen the Cybersecurity of Electronic Protected Health Information1 (the Proposed Rule). The Proposed Rule was announced during the Biden administration. Thus far, the Trump administration has not indicated any intent to table or abandon it. Consequently, the Proposed Rule remains on track to be finalized in May 2026, with compliance dates for these new HIPAA security requirements potentially coming in late 2026.2 If finalized as anticipated, this will be the first time since 2013 that the Security Standards for the Protection of Electronic Protected Health Information (the Security Rule) will be materially updated.3

Since the Security Rule was first implemented, the cybersecurity landscape has drastically evolved. The healthcare industry, as well as many others, has suffered an influx of cybersecurity threats and attacks. OCR recognizes this reality, explaining that the Proposed Rule “seeks to strengthen cybersecurity by updating the Security Rule's standards to better address ever-increasing cybersecurity threats to the health care sector.”4

Forecasting Key Changes of the Proposed Rule

Both covered entity and business associate clients would be impacted by the Proposed Rule. Below, we preview a few of the key changes that would apply if the rule is finalized as proposed.

First, the Proposed Rule will institute new requirements for business associates. Among other changes, business associates will have to provide annual written verification of certain requisite technical safeguards set forth in the Security Rule for their covered entity customers. Similarly, business associate subcontractors will have to provide this same verification to their business associate customers. Such verification must include an analysis by an individual with cybersecurity expertise and a certification of accuracy by an authorized individual at the business associate or subcontractor, as applicable.5

Currently, the Security Rule only requires a business associate to provide written assurances that it will implement “reasonable and appropriate safeguards”6 to protect electronic protected health information (ePHI), which is a lesser standard. The Proposed Rule introduces these and similar stricter requirements for business associates to better safeguard the ePHI they manage.

Second, the Proposed Rule will implement more stringent technical controls. In particular, all technology that accesses ePHI will be required to use multifactor authentication, and networks will have to be firewalled to better contain cyberattacks and cybersecurity threats, among other controls.7 Presently, multifactor authentication is not specifically required, as general authentication of an individual or entity seeking access to ePHI is acceptable.8 Moreover, firewalls are not expressly mandatory but rather serve as an example of a technical safeguard.9 The Proposed Rule's heightened technical safeguards are intended to better ensure the security and safety of ePHI where possible.

Third, the Proposed Rule will require a “technology asset inventory and a network map of . . . electronic information systems and all technology assets that may affect the confidentiality, integrity, or availability of ePHI[,]” along with certain implementation elements.10 Both covered entities and business associates will have to document this inventory and network mapping on an annual basis, as well as in response to changes in their environment or operations that may affect ePHI.11 At present, the Security Rule only requires that covered entities and business associates institute a general security-management process with certain implementation components.12 The Proposed Rule's requirements for such explicit, mandatory documentation evidence OCR's heightened cybersecurity expectations.
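As a purely illustrative sketch (the Proposed Rule does not prescribe any data format, and all names below are hypothetical), the kind of structured, reviewable record-keeping that an asset inventory and network map requirement implies might look like this:

```python
from dataclasses import dataclass, field
from datetime import date

# Illustrative only: a minimal record for one entry in a technology asset
# inventory, with connections between assets forming a simple network map.

@dataclass
class TechnologyAsset:
    asset_id: str
    description: str
    touches_ephi: bool  # may affect confidentiality, integrity, or availability of ePHI
    connected_to: list = field(default_factory=list)  # asset_ids; collectively, the network map
    last_reviewed: date = date(2026, 1, 1)  # refreshed annually and on relevant environmental change

inventory = [
    TechnologyAsset("srv-01", "EHR database server", True, ["fw-01"]),
    TechnologyAsset("fw-01", "Perimeter firewall", True, ["srv-01"]),
]

# Assets in scope for the documentation requirement:
in_scope = [a.asset_id for a in inventory if a.touches_ephi]
print(in_scope)  # ['srv-01', 'fw-01']
```

In practice, the annual review contemplated by the Proposed Rule would update `last_reviewed` and re-validate each asset's recorded connections.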

K&L Gates' Healthcare and FDA practice will monitor any developments to the Proposed Rule as potential finalization in May 2026 approaches. K&L Gates regularly advises on HIPAA matters and welcomes any questions our covered entity and business associate clients may have on the Proposed Rule and its implications.

FEATURED ARTICLES

US CHILDREN'S PRIVACY AND AGE ASSURANCE: INSIGHT FROM THE FTC'S WORKSHOP AND STATE LEGISLATION

By: Eric F. Vicente Flores, Whitney E. McCollum

Introduction

On 28 January 2026, the Federal Trade Commission (FTC) convened a full-day public workshop on age-verification technologies, signaling that age assurance is now a front-line enforcement and policy priority.

This article synthesizes the key themes from that workshop and examines parallel legislative developments in California that are reshaping children's privacy obligations for businesses operating digital platforms and artificial intelligence (AI)-enabled services.

The FTC Workshop – Regulatory Context and Key Themes

Regulatory Context

The FTC framed the workshop as both fact-gathering and enforcement-signaling, designed to inform a future policy statement and possible amendment to the Children's Online Privacy Protection Act (COPPA) Rule. FTC Chairman Andrew Ferguson made clear that the agency intends to push COPPA “as far as we lawfully can.” COPPA currently applies to operators of websites and online services directed at children under 13, or operators with “actual knowledge” that they are collecting personal information from children. It requires those operators to provide notice, obtain verifiable parental consent, and comply with data minimization and deletion requirements. Since COPPA's enactment in [year to be confirmed], three converging forces have been driving regulatory urgency to strengthen its enforcement and expand its reach: (1) children's pervasive online engagement on platforms not designed for minors; (2) the maturation of more sophisticated verification technologies; and (3) a fragmented patchwork of state and global requirements creating significant operational complexity for businesses.13

A Critical Distinction: “Age Verification” vs. “Age Assurance”

“Age verification” is the term commonly used when discussing age-gating concepts, but it is only a subset of the broader concept of “age assurance.” Age assurance encompasses self-declaration (generally viewed as inadequate in higher-risk contexts), inference and estimation techniques (often AI-based), and higher-assurance verification using government-issued identity documents or authoritative databases. The workshop explored a “waterfall” approach—a layered method that starts with lower-friction, privacy-friendly approaches and escalates as needed, deleting any data collected solely for age assurance at the end of each stage. Age verification technology providers advocated for “age-aware, not identity-aware” design principles and double-blind privacy-enhancing architectures that confirm a user meets a minimum age threshold without disclosing the user's identity or creating records of browsing activity.
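The layered logic of the “waterfall” approach described above can be sketched in a few lines. This is a hypothetical illustration, not any provider's actual implementation: stage names, signals, and the confidence margin are all assumptions, and each stage returns a conclusive yes/no or signals that the check should escalate.

```python
from typing import Optional

# Hypothetical "waterfall" age assurance: start with a low-friction,
# privacy-friendly stage and escalate only when it is inconclusive.
# Data collected solely for age assurance is deleted after each stage;
# these stages keep no state, so nothing is retained between them.

def estimation_stage(signals: dict, minimum_age: int) -> Optional[bool]:
    """Low-friction, AI-style age estimate; inconclusive near the threshold."""
    est = signals.get("estimated_age")
    if est is None:
        return None
    margin = 3  # illustrative confidence buffer around the threshold
    if est >= minimum_age + margin:
        return True
    if est <= minimum_age - margin:
        return False
    return None  # too close to call -> escalate

def document_stage(signals: dict, minimum_age: int) -> Optional[bool]:
    """Higher-assurance check against a verified date-of-birth record."""
    verified_age = signals.get("verified_age")
    if verified_age is None:
        return None
    return verified_age >= minimum_age

def run_waterfall(stages: list, signals: dict, minimum_age: int) -> bool:
    for stage in stages:
        result = stage(signals, minimum_age)
        if result is not None:
            return result  # conclusive: stop, collecting nothing further
    return False  # no stage conclusive: fail closed

stages = [estimation_stage, document_stage]
print(run_waterfall(stages, {"estimated_age": 35}, 18))                      # True
print(run_waterfall(stages, {"estimated_age": 17, "verified_age": 19}, 18))  # True
```

Note how the second call escalates past the inconclusive estimate to the document check, while the first resolves without any identity data at all, which is the “age-aware, not identity-aware” design goal in miniature.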

Tradeoffs and Unresolved Tensions

The FTC has not identified any single age-assurance method as definitively superior. What is clear is the need to mitigate the core privacy concerns that accompany any method, such as the risk of creating high-value databases of identity documents or biometric data, and the retention of sensitive information beyond the point of necessity. On the constitutional front, 13 state age-verification laws have been enjoined, and 11 are in effect but being challenged across eight circuits; there have been no final rulings, and the constitutionality questions outside the obscenity context were characterized as open and evolving.

One of the most significant unresolved legal questions is the “circularity problem”14 embedded in COPPA's structure: COPPA requires parental consent before collecting personal information from a child, but certain age-verification methods necessarily require collecting personal information to determine whether the user is a child in the first place. The FTC acknowledged this tension and stated that it is actively exploring potential solutions, but no resolution was announced at the workshop.

California's Expanding Children's Privacy Framework

While the FTC refines its federal approach, California continues to function as the most active state-level laboratory for children's privacy regulation, with two recent legislative developments of particular significance.

SB 243: Regulating Chatbot Interactions Involving Minors

California SB 243,15 effective 1 January 2026, addresses AI-powered chatbot technology in contexts where minors may be present. The law requires covered operators to disclose clearly that a user is interacting with an automated system rather than a human. Where a minor is a reasonably foreseeable user, obligations extend to design-level requirements intended to prevent manipulative or harmful conversational patterns and to ensure that chatbot interactions do not exploit minors' developmental vulnerabilities. Businesses deploying customer-facing conversational AI accessible to minors face immediate compliance obligations, including disclosure requirements, design constraints, and potentially integration with age-assurance mechanisms.

Assembly Bill 1043: The Digital Age Assurance Act

California Assembly Bill 1043, the Digital Age Assurance Act,16 signed into law in 2025 and operative 1 January 2027, establishes age-assurance requirements for covered online services that are “likely to be accessed by minors.” The act adopts a risk-calibrated approach rather than mandating a single technical method, incorporates data-minimization expectations, and reflects the anticipatory “likely to be accessed” paradigm that international regulators have advanced as an alternative to COPPA's “actual knowledge” trigger. Its scope is broad enough to capture social-media platforms, gaming environments, video streaming services, and online marketplaces.

Why California Matters Beyond Its Borders

California's regulatory approach to children's privacy rarely stays confined to California; the state's outsized consumer market means that compliance with California law often drives national product and policy decisions. The California Age-Appropriate Design Code Act (CAADCA), enacted in 2022 and subsequently enjoined on First Amendment grounds, illustrates the constitutional complexity of this space and the durable downstream influence such legislation can have on product design decisions. The combination of SB 243 and AB 1043 signals that California is moving toward a comprehensive framework addressing not just what data is collected from minors, but how services are designed and what technical mechanisms must be in place before a minor engages with a covered service—a shift from privacy as data governance to privacy as product design.

A Compliance Posture for an Evolving Landscape

The FTC's January 2026 workshop and California's parallel legislative developments together signal that age assurance is transitioning from a best practice to a regulatory expectation. For businesses attempting to future-proof compliance, we suggest several near-term actions:

  • Conduct an age-assurance audit to assess current mechanisms, identify gaps relative to emerging federal and state expectations, and evaluate privacy-preserving third-party solutions that minimize raw identity data collection.
  • Review post-gate design to ensure that protective defaults—including content restrictions, contact controls, and advertising limitations—actually activate for identified minor users.
  • Monitor federal and state rulemaking, including the FTC's forthcoming policy statement and possible COPPA Rule amendment, and assess California obligations under SB 243 (now in effect) and AB 1043 (operative 1 January 2027).

AUSTRALIA: AGE ASSURANCE TECHNOLOGY REACHES MATURITY

By: Cameron Abbott, Rob Pulham, Stephanie Mayhew

The Australian Government released its Final Report on the Age Assurance Technology Trial17 at the end of 2025. Its findings will underpin new rules implementing the social media minimum age limit laws, which are required to be in place by 10 December.

The Final Report's key findings are:

  1. Age assurance can be done in Australia privately, efficiently and effectively.
  2. There are no substantial technological limitations preventing its implementation to meet policy goals.
  3. Provider claims have been independently validated against the project's evaluation
  4. A wide range of approaches exist, but there is no one-size-fits-all solution for all contexts.
  5. We found a dynamic, innovative and evolving age assurance service sector.
  6. We found robust, appropriate and secure data handling practices.
  7. There is scope to enhance usability, risk management and system
  8. Parental control tools can be effective but may constrain children's digital participation and evolving autonomy.
  9. Systems performed broadly consistently across demographic groups, including Indigenous
  10. Systems generally align with cybersecurity best practice, but vigilance is required.
  11. Unnecessary data retention may occur in apparent anticipation of future regulatory
  12. Providers are aligning to emerging international standards around age assurance.

As noted on the eSafety Commissioner's website, a range of technologies is available to check age, both at the point of account sign-up and later. It will be up to each platform to decide which methods it uses.

Social media platforms will now need to monitor the development of guidelines by the eSafety Commissioner and ensure that they take 'reasonable steps' to prevent users under 16 from having accounts on their platforms, using steps that are just and appropriate in the circumstances.

It appears these social media age limitation laws born so briskly late last year are now coming of age.

FROM THE FLOOR

IAPP UK INTENSIVE 2026 UPDATE

By: Nóirín M. McFadden, Dr. Thomas Nietsch

Nóirín M. McFadden (London) and Dr. Thomas Nietsch (Berlin) attended this year's IAPP Intensive conference in London in February.

Thomas took part in a panel session on international data transfers alongside Emma Bate, director of legal services at the Information Commissioner's Office (ICO), Matt Houlihan, Vice President, Global Affairs, Europe at Cisco, and Gabriela Mercuri, Managing Director, SCOPE Europe. The panel was well attended, and comments made from the stage about the “trauma” of dealing with the aftermath of Schrems II and constantly shifting EU data-transfer requirements were referenced at other sessions during the conference. The panel was subsequently highlighted as one of the top three ranked panels at the conference.

Nóirín hosted a lively lunchtime roundtable on privacy, online safety, and age assurance—a particularly timely topic, as the protection of children's data was a recurring theme across the conference.

The Information Commissioner, John Edwards, delivered his final speech to this conference in his opening keynote. The ICO will be undergoing a restructure as the Data Use and Access Act (DUAA) takes effect and will be steered by a board in the future. The Commissioner's address covered themes of meeting the challenge of regulating privacy in the face of complex changes in technology. He referenced the ICO's investigation into X/xAI over Grok's inappropriate image generation and its decision—issued just the day before the conference—to fine Reddit nearly £14.5 million for the unauthorized processing of children's data.

Other notable talks included a panel on handling complex Data Subject Access Requests (DSARs), with insight into the issues that organizations face with this growing trend, and the ICO's approach to DSARs. Elsewhere, a session comparing the DUAA and the European Union's proposed digital omnibus suggested that the United Kingdom's data-protection regime is holding its own as a robust and secure framework post-Brexit.

Footnotes

1 90 Fed. Reg. 898 (proposed Dec. 27, 2024) [hereinafter, Proposed Rule].

2 Id. at 900-1.

3 Id. at 899.

4 HIPAA Security Rule Notice of Proposed Rulemaking to Strengthen Cybersecurity for Electronic Protected Health Information Fact Sheet, US Department of Health and Human Services (Dec. 27, 2024), https://www.hhs.gov/hipaa/for-professionals/security/hipaa-security-rule-nprm/factsheet/index.html.

5 Proposed Rule, at 1,016.

6 Id. at 916.

7 Id. at 1,014, 1,018.

8 Id. at 926.

9 Id. at 928.

10 Id. at 937.

11 Id. at 1,013.

12 Id. at 934.

13 Andrew Ferguson, Chairman, FTC.

14 Sara Kloek, Vice President, Education and Youth Policy at Software & Information Industry Association.

15 https://leginfo.legislature.ca.gov/faces/billNavClient.xhtml?bill_id=202520260SB243

16 https://leginfo.legislature.ca.gov/faces/billTextClient.xhtml?bill_id=202520260AB1043

17 Australian Government, Final Report on the Age Assurance Technology Trial.

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.

