The California Privacy Protection Agency (CPPA) kicked off a first round of rulemaking in May 2022 and finalized that set of rules in March 2023. At the agency's latest meeting, the CPRA Rules Subcommittee (Rules Subcommittee), which is responsible for drafting new regulations under the statute, provided an update on key issues it must consider as it prepares another set of rules relating to (1) cybersecurity audits, (2) privacy risk assessments, and (3) automated decisionmaking technology. On February 10, 2023, the CPPA issued an "invitation for preliminary comments" on these issues, and although it has not formally introduced the proposed text of this next round of rulemaking, the information recently shared by the CPPA provides a strong indication of the relevant considerations for future regulations.

Cybersecurity Audits

The CPPA is tasked with issuing regulations requiring businesses engaged in high-risk processing of personal information to perform cybersecurity audits.[1] In its recent update, the Rules Subcommittee recommended the following potential thresholds for triggering a cybersecurity audit requirement:

  • Businesses primarily or significantly engaged in the sale or sharing of personal information (e.g., data brokers); OR
  • Larger businesses (for example, businesses that meet a particular revenue threshold) that ALSO:
    • Annually process the personal information of a threshold number of consumers or households; or
    • Annually process the sensitive personal information of a threshold number of consumers; or
    • Annually process the personal information of a threshold number of consumers that the business has actual knowledge are less than 16 years of age.

To ensure thoroughness, the Rules Subcommittee also recommended that the audits (i) identify the specific evidence examined, (ii) list the components of the cybersecurity program being assessed, and (iii) assess and document all applicable components of the business's cybersecurity program. Similarly, to ensure the audits' independence, the Rules Subcommittee recommended that the business be required to provide relevant information to an independent auditor, who would determine the scope and evaluation criteria for the audit.

Privacy Risk Assessments

The CCPA empowers the CPPA to issue regulations requiring businesses whose processing of consumers' personal information presents significant risk to consumers' privacy or security to conduct privacy risk assessments.[2] In determining what constitutes "high-risk" processing that would trigger this requirement, the Rules Subcommittee recommended the following thresholds, which would apply to businesses engaged in:

  • Selling or sharing personal information;
  • Processing sensitive personal information, with an exception for employment data;
  • Processing personal information of individuals under the age of 16; or
  • Using automated decisionmaking technology in furtherance of a decision that results in the provision or denial of financial or lending services, housing, insurance, education enrollment or opportunity, criminal justice, employment or contracting opportunities or compensation, healthcare services, or access to essential goods, services, or opportunities.

The Rules Subcommittee identified the following potential elements for the risk assessments:

  • Summary and purpose of the processing;
  • Necessity of the processing to achieve the purpose;
  • Consumers' reasonable expectations regarding the purpose of processing or the purpose's compatibility with the context in which the personal information was collected;
  • Minimum personal information that is necessary to achieve the purpose;
  • Nature and scope of processing (e.g., how long the business will retain the information);
  • Benefits resulting from the processing (to the business, the consumer, other stakeholders, the public);
  • Risks to consumers' privacy associated with the processing and safeguards the business implements to address those risks;
  • Assessment of whether the risks outweigh the benefits;
  • Names and titles of individuals responsible for preparing and reviewing the risk assessment; and
  • Additional assessment requirements for automated decisionmaking technology.

Automated Decisionmaking

The CCPA empowers the CPPA to issue regulations governing the use of automated decisionmaking technology (ADMT), specifically with regard to consumer access and opt-out rights.[3]

In its update, the Rules Subcommittee included a "potential definition" of ADMT as "any system, software, or process—including one derived from machine-learning, statistics, or other data processing or artificial intelligence techniques—that processes personal information and uses computation as whole or part of a system to make or execute a decision or facilitate human decisionmaking."

This proposed definition is quite broad. Although its application to artificial intelligence/machine learning (AI/ML) systems will likely draw considerable attention, the CPPA's regulations, if adopted, would also reach a much broader array of systems and technologies that do not rely on AI/ML, increasing compliance burdens and costs for many businesses operating in California.

The Rules Subcommittee recommended that the following uses of ADMT should trigger consumer access and opt-out rights:

  • Use of ADMT in furtherance of a decision that results in the provision or denial of financial or lending services, housing, insurance, education enrollment or opportunity, criminal justice, employment or contracting opportunities or compensation, healthcare services, or access to essential goods, services, or opportunities.
    • Combined with the proposed expansive definition of ADMT, this threshold is extremely inclusive and, if adopted in its current form, would capture a significant amount of economic activity in California.
  • Use of ADMT to monitor or surveil employees, independent contractors, job applicants, or students.
    • The CCPA is an outlier among US state privacy laws in that it covers employee data. If this threshold is adopted, employers in California will need to assess whether they use ADMT, as it is ultimately defined by the CPPA, and ensure that they have processes in place to respond to employee rights requests.
  • Use of ADMT to track the behavior, location, movements, or actions of consumers in publicly accessible places.
    • This threshold reflects policymakers' increasing concern about the privacy implications of using ADMT to surveil public spaces. For example, the White House's Blueprint for an AI Bill of Rights states that "surveillance technologies should be subject to heightened oversight that includes…scope limits to protect privacy and civil liberties." For businesses engaged in these threshold activities, compliance with consumer access and opt-out requests may prove difficult given the complex interaction between ADMT and real-world environments.

The Rules Subcommittee also recommended that the CPPA discuss whether the following processing activities should trigger consumer rights:

  • Using ADMT to process the personal information of individuals the business knows are less than 16 years old.
  • Processing personal information of consumers to train ADMT.

Although these updates provide insight into the CPPA's current thinking, the Rules Subcommittee has significant issues to address before proposing draft regulations. For example, the CCPA instructs that when responding to consumer access requests, businesses must "include meaningful information about the logic involved in [the] decisionmaking processes, as well as a description of the likely outcome of the process with respect to the consumer." How the CPPA defines "meaningful information" will have important implications for the level of transparency that businesses employing ADMT must provide to consumers.

Conclusions

The CPPA's future rulemaking is likely to generate substantial controversy, particularly regarding its potential application to emerging artificial intelligence tools. When the rulemaking process officially begins, companies will have the opportunity to submit comments. DWT's Privacy and Security and AI Teams regularly advise clients on policy issues involving privacy and artificial intelligence and will continue to closely monitor the CPPA's regulatory initiatives as they advance through the rulemaking process.



[1] Cal. Civ. Code § 1798.185(a)(15)(A).

[2] Specifically, the CPPA is tasked with issuing regulations requiring businesses to “[s]ubmit to the California Privacy Protection Agency on a regular basis a risk assessment with respect to their processing of personal information, including whether the processing involves sensitive personal information, and identifying and weighing the benefits resulting from the processing to the business, the consumer, other stakeholders, and the public, against the potential risks to the rights of the consumer associated with that processing, with the goal of restricting or prohibiting the processing if the risks to privacy of the consumer outweigh the benefits resulting from processing to the consumer, the business, other stakeholders, and the public.” Cal. Civ. Code § 1798.185(a)(15)(B).

[3] Specifically, the CPPA is empowered to issue regulations “governing access and opt-out rights with respect to businesses’ use of automated decisionmaking technology, including profiling and requiring businesses’ response to access requests to include meaningful information about the logic involved in those decisionmaking processes, as well as a description of the likely outcome of the process with respect to the consumer.” Cal. Civ. Code §§ 1798.185(a)(16), 1798.140(z).