ARTICLE
11 December 2025

FTC's Proposed Consent Order With Illuminate Education: A Tale Of Two Enforcement Approaches

Roth Jackson

Contributor

Roth Jackson and Marashlian & Donahue’s strategic alliance delivers premier regulatory, litigation, and transactional counsel in telecommunications, privacy, and AI, guiding global technology innovators with forward-thinking strategies that anticipate risk, support growth, and navigate complex government investigations and litigation challenges.

On December 4, 2025, the Federal Trade Commission (FTC) released a proposed Consent Order resolving its investigation into Illuminate Education, Inc., a provider of K–12 educational platforms that stores highly sensitive student data. The FTC alleges that Illuminate failed to implement even baseline information security controls, leading to a breach in which a threat actor had 13 days of unrestricted access and exfiltrated millions of student records.

The FTC action is only part of the story; comparing it to the parallel state enforcement reveals a striking divergence in regulatory appetite.

Just weeks earlier, on November 6, 2025, the Attorneys General (AGs) of California, Connecticut, and New York announced a $5.1 million settlement with Illuminate over the same incident. California alone secured $3.25 million in civil penalties for the more than 434,000 California students affected. The state orders also impose specific, prescriptive technical requirements that go well beyond what the FTC demanded.

The FTC took a light-touch, back-to-basics approach that any competent Chief Information Security Officer (CISO) would consider table stakes and did not impose a financial penalty. In stark contrast, the states extracted real dollars and imposed detailed operational mandates. The contrast is instructive for companies operating in data-rich environments and for the schools that rely on them. This divergence continues to reinforce the trend that meaningful enforcement is increasingly coming from the states, not Washington.

What the FTC Found

The proposed complaint alleges that Illuminate represented to school districts, students, and parents that it would safeguard student personal information, but failed to do so. In particular, the FTC highlights that Illuminate:

  • Stored students' personal information in plaintext in Amazon S3 buckets until at least January 2022 – an outdated security practice;
  • Failed to implement reasonable access controls for data stored in AWS;
  • Lacked effective threat detection and response capabilities;
  • Did not maintain a robust vulnerability management and patching program;
  • Improperly configured or failed to fully implement logging and monitoring tools;
  • Did not have a comprehensive incident response plan until late 2022;
  • Had no policy or procedure for inventorying and deleting student data that was no longer needed until at least March 2022;
  • Misrepresented its practices in contracts and other materials by characterizing its practices and procedures as "designed to meet or exceed private industry best practices," and by representing that it would provide timely breach notifications, even though it waited two years to notify some districts of breaches.

The FTC alleges these failures were avoidable using "readily available and relatively low-cost security measures," and that they caused or were likely to cause substantial injury to consumers, in violation of Section 5 of the FTC Act. Specifically, the FTC asserts three counts under Section 5:

  • Unfair security practices;
  • Deceptive representations about reasonable data security; and
  • Deceptive representations about providing timely breach notifications.

The FTC's All-Too-Familiar Light-Touch Remedy

The breach affected 10 million students over 13 days of unrestricted data access, with Illuminate waiting two years to notify impacted school districts. Despite the severity of the described breach, the Consent Order reads less like aggressive enforcement and more like a compliance refresher. The proposed Consent Order imposes no financial penalty and essentially codifies security practices that any mature organization should already have in place. Specifically, the 10-year Consent Order requires Illuminate to:

  • Stop making misrepresentations about the privacy, security, availability, confidentiality, or integrity of covered information, or about breach notification timing (Part I);
  • Delete unnecessary data when not needed to provide contracted services or at the customer's request (Part II);
  • Create a documented data retention schedule, which must include the purposes for collection and timeframes for deletion (Part III);
  • Implement a comprehensive information security program addressing the security, availability, confidentiality, and integrity of covered information (Part IV);
  • Undergo independent third-party security assessments initially and then biennially for 10 years (Part V), supported by truthful disclosure of material facts to the independent third-party (Part VI);
  • Provide annual CISO certifications that the company has implemented the required program and is not aware of any uncorrected material noncompliance (Part VII);
  • Notify the FTC whenever Illuminate notifies a government entity of a breach or unauthorized exposure of covered information (Part VIII); and
  • Comply with standard reporting, recordkeeping, and compliance monitoring provisions (Parts IX–XII).

For most organizations that claim to "take security and privacy seriously," these requirements should look like existing practice, not a new high-water mark.

The States Take a Harder Line

In stark contrast, the states have not only secured $5.1 million in penalties (California receiving $3.25 million, New York $1.7 million, and Connecticut $150,000), but they have also imposed prescriptive technical requirements. Where the FTC order speaks in generalities about a "comprehensive information security program," the state AGs' settlement requires specific controls, including:

  • A comprehensive information security program;
  • Access controls and policies, including credential audits, quarterly access reviews, and authentication requirements;
  • Encryption;
  • Network and system monitoring, including real-time monitoring and alerts; and
  • Vulnerability management, including explicit penetration testing.

California and Connecticut impose requirements that go beyond the baseline controls listed above. California also requires:

  • Conducting quarterly credential audits to verify that only current employees have access;
  • Isolating backup databases from active databases to prevent cascading compromises;
  • Notifying the California Department of Justice directly of breaches involving student data; and
  • Proactively reminding school districts to review student data stored on their behalf, including retention and deletion obligations.

Connecticut's order goes further still, requiring:

  • Contract reviews to conform all school district agreements to state law;
  • Data inventories and data minimization;
  • Explicit penetration testing (not just "assessments");
  • A right to delete data; and
  • Vendor monitoring obligations.

First-of-Their-Kind Enforcement. California's action marks the state's first enforcement under the K–12 Pupil Online Personal Information Protection Act (KOPIPA). Connecticut's action is its first under its Student Data Privacy Law. These inaugural cases set the baseline for future enforcement.

What the Federal and State Divergence Signals

The contrast between the federal and state approaches carries important implications for industry:

  1. Don't Mistake Federal Restraint for a Free Pass. The FTC's light touch in this case may reflect the current FTC enforcement priorities, resource constraints, or a preference for letting states take the lead on specific sectors. But it does not mean the underlying conduct was acceptable; the same facts that produced a no-penalty federal Consent Order resulted in $5.1 million in state penalties and detailed injunctive relief.
  2. State AGs Are the Enforcement Engine, For Now. Companies that calibrate their compliance programs to federal enforcement risk alone are likely underestimating their exposure. State AGs, particularly in California, New York, Connecticut, and other privacy-forward jurisdictions, are increasingly willing to use sector-specific statutes (KOPIPA, SOPIPA, state student data privacy laws) to pursue cases the FTC might handle more gently.
  3. First Enforcement Cases Set the Floor. California and Connecticut used this case to establish their enforcement posture under relatively new statutes. Other states with similar student data privacy laws will be watching. These inaugural actions often define what regulators will consider "reasonable" in future cases, and the prescriptive requirements in these settlements will likely become the de facto standard.
  4. Expect Coordinated, Multi-Front Enforcement. A single data incident now routinely triggers parallel investigations by the FTC and multiple state AGs, with each resolution having its own statutory hooks, penalty structures, and remedial demands. Companies must plan for overlapping enforcement, not sequential. The Illuminate matter shows how Section 5 of the FTC Act, KOPIPA, state student data privacy laws, and state UDAP statutes can all apply to the same facts simultaneously.
  5. Regulatory Divergence Creates Compliance Complexity. When federal and state orders impose different requirements for the same incident, companies face the challenge of meeting the highest common denominator. The FTC's biennial assessment requirement is less demanding than Connecticut's annual penetration testing mandate. California's backup isolation requirement does not appear in the federal order. Companies operating nationally must build compliance programs to the most stringent standard, which, in this case, is the state standard.

What the Orders Don't Address: AI and Model Training

Equally important is what neither the federal nor state actions address. The public documents do not grapple with whether the exfiltrated student data was used to train internal or third-party machine-learning models, nor do they require Illuminate to unwind or retrain models that may have been built on improperly secured data.

That silence matters:

  • If models were trained on compromised or improperly protected data, the value of that training persists even if raw source records are deleted.
  • The deletion and minimization obligations appear focused on primary data stores, not downstream use of that data in AI systems.
  • If Illuminate has already "harvested" learning value from the data before 2022, the practical impact of deleting older records may be relatively limited from an AI standpoint.

For AI-first businesses, or any company integrating AI into customer interactions, analytics, or personalization, this case suggests that regulators at both the federal and state levels are still in the early stages of articulating remedies that address trained models, derived data, and embeddings, rather than just raw records.

What This Means for Schools

Schools are the customers in this case, not the wrongdoers, but the Illuminate matter is a wake-up call about vendor oversight, contract terms, and shared responsibility for student data protection.

  1. Vendor Representations Are Not Enough. Illuminate told school districts it followed "industry best practices" and used encryption. It didn't. Schools that relied on those representations had no way of knowing student data was sitting in plaintext in unsecured cloud storage. Due diligence questionnaires and contract warranties matter, but they are only as good as your ability to verify, or your contractual remedies when vendors fall short.
  2. Breach Notification Delays Left Schools in the Dark. The FTC found that Illuminate waited nearly two years to notify some districts representing over 380,000 students. Contracts should include specific, enforceable breach notification deadlines. Vague "prompt" or "timely" language isn't protective.
  3. Schools Have Their Own Compliance Obligations. Under FERPA, schools are responsible for ensuring that vendors receiving student education records provide adequate data protection. State laws like California's SOPIPA and KOPIPA impose additional requirements on operators and, in some cases, on schools themselves. When a vendor fails, regulators and parents may ask what the school did to vet and monitor that vendor.
  4. State Settlements Create New Vendor Obligations to Schools. Unlike the FTC order, the state settlements require Illuminate to remind schools to review data retention and deletion proactively. Connecticut's order requires Illuminate to conform its contracts to state law. Schools should expect, and demand, these updated terms.
  5. Contract Provisions to Prioritize. Schools negotiating or renewing ed tech contracts should focus on: specific security standards (encryption at rest and in transit, access controls, logging); audit rights or the right to request SOC 2 reports; breach notification timelines with teeth; data deletion obligations upon contract termination; indemnification for costs arising from vendor security failures; and cyber insurance requirements.

Practical Takeaways

For Ed Tech Vendors and Data-Rich Businesses

  1. Benchmark to state standards, not federal. If you are calibrating your security program to the FTC order alone, you are likely falling short of what California, Connecticut, and New York now expect.
  2. Expect prescriptive requirements. General commitments to "reasonable security" are giving way to specific mandates: credential audits, backup isolation, penetration testing, and data inventories. Build these into your program now.
  3. Map your state law exposure. If you handle student data, health data, or children's data, identify which state-specific statutes apply, and recognize that "reasonable security" may have different contours under each.
  4. Prepare for coordinated enforcement. Build incident response plans that anticipate simultaneous federal and multi-state investigations. Budget for the possibility of state penalties even if federal enforcement is light.
  5. Align your messaging with reality. Marketing copy about "state-of-the-art" security creates exposure if your practices are ordinary. The states specifically cited Illuminate's misleading privacy policy representations.

For School Districts

  1. Inventory your ed tech vendors. Know who has student data and what data they hold.
  2. Request security documentation. SOC 2 Type II reports, penetration test summaries, or completed security questionnaires.
  3. Review contracts for state-law compliance. Connecticut's settlement requires Illuminate to conform its contracts to state law; districts can use this as leverage to update their own agreements.
  4. Demand specific breach notification timelines. The two-year delay in this case was possible in part because contracts lacked enforceable deadlines.
  5. Establish a vendor review cycle. Don't just vet at signing; reassess annually.
  6. Expect vendor outreach on data retention. California's settlement requires Illuminate to remind schools to review stored data. Be prepared to act on those reminders.

For Organizations Integrating AI

  1. Benchmark Your Security Program Against the Order: If the obligations in the FTC's Illuminate Consent Order would be a "lift" for your organization, that is a red flag. These are foundational controls, not cutting-edge requirements.
  2. Tighten Retention, Deletion, and Data Mapping: Make sure you have a defensible retention schedule, that it is followed in practice, and that your systems support deletion—not only in primary databases, but across backups and logs.
  3. Map Data Flows into AI/ML Systems: Even though the FTC is not yet explicitly mandating model retraining or deletion, regulators and private litigants will increasingly ask:
    • Where did this training data come from?
    • Was it collected and secured lawfully?
    • What happens to models if the source data was tainted?
  4. Align Your Messaging with Reality: Marketing copy and policy language about "state-of-the-art," "self-learning AI," or "best-in-class security" can create exposure if your practices are ordinary or outdated. Ensure your public promises match your internal programs.
  5. Prepare for the Next Wave of AI-Focused Enforcement: This case may be a transitional moment, signaling that the FTC and the states are still using classic Section 5 and UDAP tools while they work out how to handle AI-specific harms and remedies. Companies that document and govern their AI lifecycle now will be better positioned when the enforcement lens shifts.

How VisionAI+ Can Help: An Integrated Legal + Consulting Ecosystem

For organizations that want to move beyond check-the-box compliance and build privacy and security programs and AI-ready governance that can withstand scrutiny from regulators, auditors, and business partners, Marashlian & Donahue, PLLC's VisionAI+ Law Group and its affiliated consulting arm, VisionAI+ Consulting Group, offer a uniquely integrated ecosystem.

  • VisionAI+ Law Group delivers end-to-end legal support for AI, privacy, and data governance, helping clients design defensible positions, negotiate contracts, and respond to regulatory inquiries, all under attorney-client privilege.
  • VisionAI+ Consulting Group builds ISO/IEC 42001 and NIST AI RMF-aligned management systems, impact assessments, and operational controls designed to work in production environments.

Together, this dual-track model allows clients to allocate resources strategically: consultants handle the operational buildout, including assessments, frameworks, and documentation, at rates structured for implementation work, while lawyers focus on legal strategy, regulatory navigation, and matters requiring privilege protection. The result is a faster path to mature AI and data governance, significant cost efficiencies, and programs designed to stand up in the boardroom, in audits, and, if necessary, in court.

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.

