California continues to drive national privacy and data governance standards, and only three months into 2026, the new year is already proving to be an active one for regulation, enforcement and litigation. The California Consumer Privacy Act (CCPA) remains one of the most consequential state privacy laws in the U.S. The state's compliance expectations are rapidly expanding, with rulemaking by the California Privacy Protection Agency (now rebranded from the "CPPA" to "CalPrivacy"!), heightened enforcement by the California attorney general (AG or California AG) and CalPrivacy, emerging litigation trends, and new statutory obligations such as the Delete Act. At the same time, organizations are accelerating their deployment of artificial intelligence (AI) tools and automated decisionmaking systems, introducing additional compliance considerations as regulators increase their scrutiny of such practices. Businesses that operate in or interact with California must evaluate these developments holistically to manage risk and maintain operational resilience.
This update highlights several areas clients and privacy professionals should be watching closely.
California Enforcement Priorities
Over the past year, the California AG remained active in its privacy enforcement. At the same time, CalPrivacy (with its separate enforcement authority over the CCPA) significantly expanded its enforcement activity. Taking what we have learned over the numerous recent settlement actions and projecting forward, key enforcement priorities for California regulators include:
- Honoring Opt-Out Requests and Global Privacy Control (GPC). Regulators continue to expect efficacy and ease in the opt-out process. Businesses should regularly review their opt-out mechanisms, including cookie-based opt-outs and recognition of the GPC signal, to ensure that they function properly. In addition, regulators expect opt-outs to be honored across devices and platforms where feasible.
- Minors' Data. Regulators across the country are aligned regarding the importance of protecting children's and teens' privacy and safety online. Enforcement, including against Jam City, PlayOn and Tilting Point Media, has shown that regulators are looking at how businesses treat minors' data and expect businesses to comply with heightened CCPA restrictions on sales and sharing of children's data. A business that should know children are likely to access its services may be found to have willfully disregarded consumers' age. The California AG is expected to adopt regulations regarding age assurance and parental consent, which may provide more detailed guidance on the "willful disregard" standard.
- Sensitive Data (including location data and health data). Recent sweeps have focused on the collection, use and disclosure of sensitive location data. The California AG has emphasized that information about consumer movements can reveal other sensitive information about a consumer, such as their religion, health, etc. A recent settlement with Healthline Media LLC highlights the AG's focus on the use and disclosure of personal information that could reveal a consumer's health condition. There, the AG invoked the purpose limitation principle, arguing that the use of sensitive health information to target advertisements was not compatible with the stated purposes for collection.
- Purpose Limitation and Data Minimization. Regulators expect businesses to collect only the minimum personal information necessary to achieve the purpose for collection, including when consumers are exercising their privacy rights. Regulators are especially focused on surveillance pricing and believe that the use of personal information for surveillance pricing may violate the purpose limitation principle.
- Automated Decisionmaking. Regulators are especially focused on the rapid development of AI and the risks of disparate impact based on sensitive characteristics such as age, race and health.
- Vendor Management. Regulators rely on vendor contracts, and in some cases information provided directly from vendors, to understand how data flows through businesses. Regulators expect businesses to audit their vendors to ensure compliance with the CCPA for the entire data life cycle.
- Crackdown on Dark Patterns. Regulators are focused on manipulative interface designs that discourage consumers from exercising their rights, and have prioritized dark patterns in their enforcement actions and advisories.
- Data Broker Accountability Under the Delete Act. Enforcement sweeps focused on data broker registration will continue, and data brokers who fail to register will face penalties.
- Implementation of the Delete Request and Opt-Out Platform (DROP). CalPrivacy has emphasized DROP as a major enforcement tool going forward.
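The GPC expectation in the first bullet above has a concrete technical hook: under the Global Privacy Control specification, a participating browser sends the `Sec-GPC: 1` request header (and exposes `navigator.globalPrivacyControl` client-side). As a minimal illustrative sketch, not a compliance implementation, a server-side check might look like:

```python
def carries_gpc_signal(headers: dict) -> bool:
    """Return True if the request carries the Global Privacy Control
    opt-out signal. Per the GPC spec, the signal is `Sec-GPC: 1`."""
    # HTTP header names are case-insensitive; normalize before lookup.
    normalized = {k.lower(): v.strip() for k, v in headers.items()}
    return normalized.get("sec-gpc") == "1"

# A request whose browser sent the GPC signal:
print(carries_gpc_signal({"Sec-GPC": "1"}))        # True
# A request with no signal:
print(carries_gpc_signal({"User-Agent": "test"}))  # False
```

Detecting the header is the easy part; regulators' focus is on what happens next, i.e., that the signal is actually treated as a valid opt-out of sale/sharing and, where feasible, honored across devices and platforms.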
Further, with the new CCPA regulations having taken effect on Jan. 1, 2026, businesses should be prepared for inquiries regarding cybersecurity audits and risk assessments and tighter scrutiny of automated decision‑making tools.
Data Brokers
On Jan. 1, 2026, California's DROP went live. The platform allows California consumers to easily request all registered data brokers to stop selling their personal information. Data brokers must also register for DROP; beginning Aug. 1, 2026, data brokers will be required to process consumer DROP requests every 45 days.
Consumer engagement with DROP has been significant: More than 215,000 deletion requests have already gone out to all registered data brokers despite zero paid advertising to date. A broader multimodal outreach campaign is planned, especially to reach people most vulnerable to data misuse. As of now, more than 537 data brokers have registered, and the list continues to grow.
Importantly, data brokers face two key registrations:
- If they operated as a data broker in 2025, they must register as a data broker with CalPrivacy.
- All data brokers must register with DROP. Even a data broker that did not operate as a data broker in 2025 (and therefore is not required to complete the CalPrivacy data broker registration) must still register for DROP now.
Companies that may be considered a data broker should carefully scrutinize their practices to evaluate whether registration is required.
Beginning in August 2026, to process deletion requests, data brokers must either manually check DROP or implement the API. From there, data brokers will generate deletion lists based on the types of identifiers they collect (such as mobile advertising identifiers, or "MAIDs"), and process deletion requests. DROP uses hashed identifiers: both sides hash their inputs, and if the hashes match, the data broker must delete all applicable personal information. Data brokers must maintain suppression lists to avoid re-ingesting deleted data and must pass deletion requests downstream to any partners with whom they have shared data. Beginning Aug. 1, 2026, the stakes get higher: Violations can cost $200 per consumer per day, plus enforcement expenses. By 2028, registered data brokers will also face mandatory third-party audits to confirm they are actually deleting data as required.
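The hash-match-delete-suppress flow described above can be sketched in a few lines. This is an illustrative assumption of how a broker-side match might work; the actual DROP API, hashing scheme and identifier formats are specified by CalPrivacy, and SHA-256 over normalized identifiers here merely stands in for whatever the platform prescribes:

```python
import hashlib

def hash_identifier(value: str) -> str:
    # Both DROP and the broker must apply the same normalization
    # for hashes to match (normalization choice is illustrative).
    return hashlib.sha256(value.strip().lower().encode()).hexdigest()

def process_drop_list(drop_hashes: set, records: dict, suppression: set) -> dict:
    """Delete records whose hashed identifier appears on the DROP list,
    adding each matched hash to a suppression list so the deleted data
    is not re-ingested from upstream sources later."""
    kept = {}
    for identifier, record in records.items():
        if hash_identifier(identifier) in drop_hashes:
            suppression.add(hash_identifier(identifier))
            # ...a real broker would also propagate the deletion
            # request downstream to partners it shared this data with.
        else:
            kept[identifier] = record
    return kept

# Usage: consumer MAID "abc-123" appears on the DROP deletion list.
drop_hashes = {hash_identifier("abc-123")}
suppression = set()
remaining = process_drop_list(
    drop_hashes,
    {"abc-123": {"segment": "A"}, "xyz-789": {"segment": "B"}},
    suppression,
)
# "abc-123" is deleted and suppressed; "xyz-789" is retained.
```

The suppression list is the operationally important piece: without it, routine re-ingestion from data suppliers would silently undo deletions and expose the broker to the per-consumer, per-day penalties noted above.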
CCPA Regulations
CalPrivacy's finalized CCPA regulations governing risk assessments, cybersecurity audits, automated decision-making technology (ADMT) and more are fully effective as of Jan. 1, 2026, and create forward-looking compliance requirements while clarifying several obligations for businesses.
- Opt-Out Preference Signals. Businesses must now display to consumers whether the business has processed their opt-out preference signal. The regulations also require businesses to avoid choice designs that could undermine consumer intent.
- Risk Assessments. The regulations require formal risk assessments for processing activities that create significant risk to consumers, such as selling or sharing personal information, processing sensitive personal information, and certain uses of ADMT. A compliant assessment must describe the processing, evaluate risks and safeguards, consider alternatives, and weigh benefits against potential harms.
- For a new processing activity, risk assessments must be completed prior to initiating the processing activity.
- Risk assessments conducted in 2026 and 2027 must be submitted to CalPrivacy no later than April 1, 2028. For 2028 and beyond, risk assessments must be submitted no later than April 1 of the year following when they were conducted.
- Reminder! The CCPA also applies to a business and its interactions with job applicants, employees, contractors and business-to-business contacts. It is important to confirm whether there are any applicable exemptions to conducting a risk assessment when the processing pertains to these types of data subjects.
- Cybersecurity Audits. Businesses that meet certain data volume or risk thresholds must now complete independent cybersecurity audits. The audit must be conducted by an independent auditor who is a qualified, objective and independent professional. The auditor may be internal or external to the business but must not participate in activities that compromise the auditor's independence. The audit must review administrative, technical and physical controls and document any deficiencies and remediation steps. Ultimately, the cybersecurity audit must assess specific factors related to how the business's cybersecurity program protects personal information from unauthorized access, destruction, use, modification or disclosure and protects against unauthorized activity resulting in the loss of availability of personal information. With increased enforcement attention on security practices, these audits will become a central focus of regulatory and litigation risk.
- A business must complete its first cybersecurity audit report no later than April 1, 2028, if its gross revenue exceeded $100 million in 2026; no later than April 1, 2029, if its gross revenue was $50 million to $100 million in 2027; and no later than April 1, 2030, if its gross revenue was less than $50 million in 2028. Subsequent audits are due annually.
- For businesses deploying ADMT systems involving processing of consumers' personal information that presents "significant risk to consumers' privacy," the regulations impose new disclosure, access and opt‑out obligations. Businesses must explain how the ADMT works, provide meaningful access to key logic and factors, and offer consumers the ability to opt out for certain uses.
Data Breach
Effective Jan. 1, 2026, California amended its data breach notification statute (Cal. Civ. Code § 1798.82) through SB 446, marking a significant shift from flexible to fixed reporting timelines. Previously, companies were required to notify affected residents of a breach "in the most expedient time possible and without unreasonable delay," a subjective standard criticized as creating a compliance loophole. The amendment now mandates a 30-day deadline for notifying impacted California residents following discovery of a breach, subject only to limited exceptions for law enforcement needs or efforts to determine the breach's scope and restore system integrity. The law also imposes a new requirement for breaches affecting more than 500 residents: Businesses must submit a sample notice to the California AG within 15 days of notifying consumers. These changes bring California in line with several other states that have adopted fixed timelines and will require companies to update incident response procedures.
In November 2025, California, New York and Connecticut issued a $5.1 million joint settlement with Illuminate Education Inc. following a student data breach affecting more than 400,000 California students. Among the data compromised was sensitive race, disability and medical information. The enforcement relied on various statutes, including California's K-12 Pupil Online Personal Information Protection Act (KOPIPA), which requires businesses to implement "reasonable security procedures."
This action is California's first major enforcement under KOPIPA, signaling an increasingly stringent approach to student data privacy, particularly for ed tech vendors. The case underscores the state's expectation that companies handling children's data must maintain robust cybersecurity practices and provide accurate, non‑misleading disclosures about privacy protections.
AI
Privacy regulators continue to emphasize that AI considerations are not separate from privacy considerations. In fact, AI tools, just like other tools, need to be designed and evaluated with privacy in mind.
When organizations begin developing AI governance capabilities, one of the first tasks is to evaluate which regulatory frameworks might apply to their datasets, industry or use cases. That means risk-tiering systems, understanding contract requirements and mapping obligations under emerging standards such as NIST's AI Risk Management Framework. But governance must also remain flexible. Laws are fragmented across jurisdictions, and in some cases regulation is still limited or evolving. Instead of chasing every rule, businesses should build a central governing concept anchored in risk-based approaches, product safety thinking and a clear organizational mission. Governance must be rooted in an understanding of what the AI is intended to do, how it may actually behave and what obligations attach at different levels of risk.
Privacy programs offer a useful blueprint here. Many AI laws borrow directly from privacy's playbook, including impact assessments, thresholds and transparency requirements. But determining what threshold applies to a given use case is often difficult. This is where governance becomes both an operational and a strategic discipline: clarifying intent, documenting design and proactively building controls that establish trust with stakeholders.
What Businesses Need To Know
As California pushes forward with expansive regulatory initiatives and enforcement, organizations should review their privacy programs holistically (including governance, vendor oversight, data minimization, cybersecurity readiness and AI deployment practices) to ensure they remain compliant in a fast-evolving landscape.
The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.