California Finalizes Rules on Automated Decisionmaking, Risk Assessments & Cybersecurity Audits

This week, the California Privacy Protection Agency (CPPA) announced the approval of new regulations on automated decisionmaking technology (ADMT), risk assessments and cybersecurity audits. The rulemaking package also includes regulations concerning insurance companies and revisions to existing CPPA regulations.

The rules will have widespread effects for businesses around the world by mandating, for example, documentation of risk analyses before using personal information for new high-risk uses, opt-outs and appeals of major artificial intelligence (AI)-based decisions, and annual independent security audits that are certified to the Agency. The much-anticipated final regulations come after a rulemaking process involving substantial industry and public input and several rounds of revisions that culminated with final approval by the CPPA Board and the California Office of Administrative Law.

Here are the highlights:

New Notice Obligations and Privacy Rights for Certain Uses of AI

What’s covered?

The regulations address AI in the form of ADMT, defined as “any technology that processes personal information and uses computation to replace or substantially replace human decisionmaking.” Importantly, the final definition carves out any ADMT decisions made with “human involvement,” where a human reviewer knows how to interpret the technology’s output, review and analyze the output and weigh it with other information, and has authority to make or change a decision based on their analysis. The new carveout aligns more closely with the European Union’s General Data Protection Regulation (GDPR) and similar UK GDPR focus on fully automated decisionmaking. It also makes the regulations narrower than the Colorado Privacy Act, which governs “human reviewed automated processing” and “human involved automated processing” in addition to “solely automated processing.”

In addition to pre-use notice and data subject rights already required by the CCPA, the new rules add further obligations to any business deploying ADMT that is used to make “significant decisions,” such as those affecting access to or provision or denial of financial or lending services, employment, housing, health care services, education, criminal justice, or essential goods or services.

What’s required?

Before using ADMT to make a significant decision, businesses must provide consumers with a prominent, plain language notice that explains (i) the ADMT’s specific purpose, (ii) the categories of personal information that could affect ADMT output, (iii) the intended outputs, (iv) the extent to which humans may be part of the decisionmaking process and (v) a description of consumer rights associated with ADMT. This pre-use notice can be included in the existing notice at collection or presented separately.

Consumers will have the right to opt out of ADMT used for significant decisions, subject to limited exceptions, such as where the business provides and clearly describes a method consumers can use to appeal the decision to a human reviewer with authority to overturn the decision. Additional exceptions exist for decisions made solely for education admissions or acceptances, employment hiring decisions, allocation/assignment of work, or compensation decisions, provided that the ADMT works for the business's purpose and does not unlawfully discriminate based on protected characteristics.

Consumers will also have the right to request access to detailed information about the ADMT, which means businesses will need to provide the specific purpose(s) for which the ADMT was used, information about the logic of the ADMT, the outcome of the decision-making process for the consumer and how the output affected the decision.

Important dates

Businesses will be required to comply with these requirements beginning January 1, 2027.

New Requirement to Conduct and Document Risk Assessments Before New Uses of Data That Present “Significant Risks”

What’s covered?

Businesses will be required to conduct and document risk assessments before engaging in any form of data processing that presents a “significant risk” to consumer privacy. Taking a page from European privacy law, the assessments are intended to determine whether the risks to consumers’ privacy outweigh the benefits to the consumer, the business, other stakeholders and the public. Processing that presents significant risk includes:

  • Selling or sharing personal information;
  • Processing sensitive personal information;
  • Using ADMT for a significant decision;
  • Using automated processing to infer or extrapolate various consumer traits and behavior based on (i) the consumer’s presence in sensitive locations or (ii) systematic observation of the consumer acting in their capacity as a job applicant, student, employee or independent contractor of the business;
  • Processing personal information that the business intends to use to train ADMT systems for a significant decision; and
  • Processing personal information that the business intends to use to train a facial recognition, emotional recognition, or other technology for identity verification, physical/biological identification or profiling.  

What’s required?

Risk assessments must identify:

  • The specific purpose of processing;
  • The categories of personal information subject to processing;
  • Operational elements of processing such as the method of collection and how the personal information will be disclosed;
  • Benefits to the consumer, business, other stakeholders and the public; and
  • Potential negative impacts to the consumer, as well as their sources and causes.

Businesses will also be required to identify and document in risk assessment reports:  

  • Planned safeguards to mitigate negative impacts;
  • Whether the business will initiate the processing;
  • The individuals who provided information for the assessment;
  • The date of review and approval of the assessment; and
  • The names and positions of individuals who reviewed or approved the assessment. 

Important dates

Beginning in January 2028, and annually thereafter, covered businesses will be required to submit certain risk assessment materials through the CPPA’s website, including an executive attestation. Businesses must update risk assessments at least every three years or within 45 days of any material change and retain original and updated risk assessments for the longer of the processing lifecycle or five years after completion of the assessment. The CPPA and Attorney General can demand copies of completed risk assessment reports on 30 days’ notice.

New, Independent Cybersecurity Audits Must Be Performed Annually

What’s covered?

Businesses whose processing of personal information presents significant risk to consumers’ security will be required to conduct annual cybersecurity audits and document them in audit reports. The “significant risk” threshold is revenue and volume based, and applies to businesses that either (i) derive at least 50 percent of their annual revenues from selling or sharing personal information or (ii) had more than $25 million in revenue and processed personal information of at least 250,000 consumers or sensitive personal information of at least 50,000 consumers in the preceding calendar year.

What’s required?

Audits must be performed by an internal or external qualified, objective, independent auditor following recognized cybersecurity standards. To maintain independence, businesses using internal auditors must have the highest-ranking auditor report directly to an executive who does not have direct responsibility over the business’s cybersecurity program.

Audit findings must rely on specific evidence that the auditor deems appropriate, such as documents, sampling and testing, and interviews. The business as well as its service providers and contractors must cooperate with and make available to the auditor all relevant information requested.

Audits must assess the business’s implementation of several cybersecurity program components, including:

  • Multi-factor authentication;
  • Encryption of personal information at rest and in transit;
  • Account management and access roles;
  • Asset and data inventories;
  • Secure hardware/software configurations and updates;
  • Internal and external vulnerability scanning and penetration testing;
  • Audit log management;
  • Network monitoring and defenses;
  • Antivirus and antimalware protection;
  • Network segmentation;
  • Secure code development;
  • Cybersecurity awareness, education, and training;
  • Vendor management;
  • Data retention and disposal;
  • Incident response management; and
  • Business continuity and disaster recovery plans.

The audit report must include extensive information, such as a description of the business’s information system, assessment of the cybersecurity program’s effectiveness, and gaps and remediation plans. The report must also identify the individuals responsible for the business’s cybersecurity program and the auditor and include a signed statement certifying completion.

Important dates

Audit reports must be submitted to an executive with direct responsibility for the business’s cybersecurity program and retained for five years. Businesses will also be required to certify completion to the CPPA annually.

Reporting deadlines vary based on annual revenue, but larger companies must complete their first audit reports by April 1, 2028, with an annual cadence thereafter.

Changes to Existing Opt-Out Provisions: Preference Signals and Notice Requirements

The rulemaking package also includes several updates to existing CPPA regulation requirements.

For example, businesses will soon be required to display on their websites whether they have processed a consumer’s opt-out preference signal. A business may, for instance, display a notice like “Opt-Out Request Honored” and show a consumer’s opt-out status through toggles or radio buttons. This change comes at a time when regulators are paying close attention to opt-out preference signals like the Global Privacy Control (GPC). Earlier this month, the California, Colorado and Connecticut Attorneys General announced a joint investigative sweep involving potential noncompliance with the GPC.

Additionally, the revised regulations specify that businesses must provide notice of the right to opt out of sale/sharing and the right to limit the use of sensitive information at or before the point of data collection. In cases where the business collects personal information in augmented or virtual reality, such notice can instead come before or at the time the consumer enters the AR/VR environment.

What’s Next?

The regulations take effect January 1, 2026, with phased deadlines for certain obligations beginning in 2027 and 2028. In the meantime, businesses should work with legal counsel to inventory potentially covered ADMT systems, update consumer-facing privacy notices, review vendor contracts, identify high-risk data processing activities, and develop privacy risk assessment and cybersecurity audit procedures.

Both the CPPA and the California Attorney General can bring enforcement actions under the CCPA, which can include audits, investigations and penalties for non-compliance. In the past year, both agencies have ramped up their enforcement efforts.

In March, the CPPA announced a settlement with a multinational car manufacturer over alleged CCPA violations concerning consumer privacy rights and vendor contracts, which marked the Agency’s first enforcement action under the CCPA. In May, the CPPA announced its second enforcement action—this time against a national clothing retailer—for alleged violations related to consumer privacy rights. Each settlement required the business to pay six-figure fines and implement changes to its privacy practices.

Then in July, the California Attorney General announced a $1.55 million settlement with a major digital health company—its fourth and largest to date. The Attorney General claimed that the company used online tracking technologies and subsequently disclosed sensitive health-related information, allegedly violating the CCPA’s consent and purpose limitation requirements.

We expect regulatory enforcement to intensify both in California and among the many other states currently ramping up their privacy regulatory programs. Stay tuned for further guidance from Manatt as data protection regulations progress. More information about Manatt's Privacy and Data Security practice can be found here.