The California Privacy Protection Agency ("CPPA" or "Agency") is seeking preliminary comments on proposed rulemaking for risk assessments and cybersecurity audits for higher-risk data processing activities, and consumer rights related to automated decisionmaking (ADM) technologies. The CPPA is required to issue regulations on these topics under the California Privacy Rights Act ("CPRA"), which amended the California Consumer Privacy Act ("CCPA") and created the CPPA.   

The CPPA is seeking input from stakeholders in developing and proposing specific regulations that implement these CPRA amendments to the CCPA. For each topic addressed—cybersecurity audits, risk assessments, and ADM—the rulemaking notice poses specific questions for commenters and invites stakeholders to propose specific language for new regulations. The Agency issued its request for comments (the "Request for Comments") on February 10, 2023, and comments are due March 27, 2023. Below, we summarize the relevant statutory requirements and the Request for Comments on each of the three topics addressed.

Businesses likely to be covered by the CCPA—generally, for-profit entities doing business in California that: have a gross annual revenue of over $25 million; buy, sell, or share the personal information of 100,000 or more California residents or households; or derive 50 percent or more of their annual revenue from selling or sharing California consumers' personal information—are advised to follow this rulemaking closely. Once finalized, these rules could impose significant requirements on how companies design and implement their cybersecurity programs, assess and mitigate the risks of processing personal information, and deploy ADM technologies. 

Cybersecurity Audits

Under the CPRA amendments, the CPPA must issue regulations requiring covered businesses "whose processing of consumers' personal information presents significant risk to consumers' privacy or security" to perform annual cybersecurity audits. Cal. Civ. Code § 1798.185(15)(A). Businesses required to conduct these cybersecurity audits must "establish a process to ensure that audits are thorough and independent." 

The CPPA's Request for Comments seeks information on other laws that require "cybersecurity audits" and the extent to which those laws' requirements align with the CCPA's. The answers to those questions are not readily apparent, as existing laws do not use the precise term "cybersecurity audits," and that term could be understood to mean different things. In the cybersecurity space, an "audit" typically refers to an assessment of a company's security controls against an established set of requirements or standards, and frequently is conducted by an outside company or a dedicated in-house function. But the term also could refer to other types of cybersecurity assessments, such as broader evaluations of a security program's overall effectiveness, or more specific assessments of security issues and vulnerabilities, such as through log review, vulnerability scanning, or penetration testing.

Existing laws contain various requirements that could be understood as—or at least related to—cybersecurity audits. Those include:

  • Requirements under some state laws, such as those in Massachusetts, New York, and Oregon, to regularly assess the sufficiency of the company's security program and test the effectiveness of its key security controls;
  • Requirements under the Federal Trade Commission's Safeguards Rule, adopted pursuant to the Gramm-Leach-Bliley Act (“GLBA”), and the New York Department of Financial Services' (“NYDFS”) cybersecurity regulations to periodically review technical and physical access controls, and to regularly test the effectiveness of key safeguards, such as through continuous monitoring measures, penetration testing, and vulnerability assessments;
  • Requirements under the Health Insurance Portability and Accountability Act's ("HIPAA") Security Rule to "review and modify the security measures … as needed to continue provision of reasonable and appropriate protection of electronic protected health information," and to "[i]mplement procedures to regularly review records of information system activity, such as audit logs, access reports, and security incident tracking reports"; and
  • Proposed cybersecurity regulations issued by the Securities and Exchange Commission for investment advisors and funds that would require covered entities to "[r]eview and assess the design and effectiveness of the cybersecurity policies and procedures" and to "[p]repare a written report that, at a minimum, describes the review, the assessment, and any control tests performed, explains their results, documents any cybersecurity incident that occurred since the date of the last report, and discusses any material changes to the policies and procedures since the date of the last report" (we discussed the SEC's proposed rules here).

The FTC also recently issued an advance notice of proposed rulemaking (“ANPR”), which requested input on audits and other assessments in the context of "commercial surveillance and data security" (we discussed the ANPR here).

When developing its regulations, the CPPA will need to determine what audit criteria it will require businesses to use when conducting a "cybersecurity audit." Will businesses need to audit their programs for compliance with an established set of cybersecurity controls? Or will businesses be permitted to develop and audit their programs against their own custom-built requirements? Notably, all of the data security laws listed above generally take the latter approach: they require covered entities to conduct security-related risk assessments and then assess their security programs based on their ability to address the identified risks. As we discuss in more detail below, while the CCPA and Request for Comments refer to "risk assessments," they do so in the context of data privacy and processing risks—rather than specifically data security risks. It remains to be seen whether the CPPA will follow the risk-based approach to security assessments taken by other data security laws, or will adopt a more traditionally understood "audit" requirement, whereby businesses will have to evaluate their compliance with specific security controls.

The Request for Comments also asks for information on non-required "cybersecurity audits, assessments or evaluations" that businesses currently perform, and on related best practices. Businesses frequently conduct "audits" and other assessments of their security programs using common industry frameworks, including:   

  • ISO/IEC 27001 and related standards for information security management;
  • The American Institute of Certified Public Accountants' Systems and Organization Controls (“SOC”) 2 – SOC for Service Organizations: Trust Services Criteria (commonly referred to as a "SOC 2" report);
  • NIST Special Publication (“SP”) 800-53, Security and Privacy Controls for Information Systems and Organizations, aimed at federal agencies and their contractors;
  • The Center for Internet Security (“CIS”) Controls, which the California attorney general previously cited as a pathway for meeting "reasonable security" requirements under California law;
  • The NIST Cybersecurity Framework (“NIST CSF”), which was developed to address critical infrastructure security but is applicable to any organization and incorporates other generally applicable standards; and
  • The HITRUST Common Security Framework (“HITRUST CSF”), which grew out of efforts to improve cybersecurity in the healthcare sector but applies across industries.

With some variation, these frameworks call for the adoption of specific security safeguards and activities. "Cybersecurity audits" using these frameworks likely would have the look and feel of more traditional "audits"—i.e., they would be assessments of whether a business has adopted enumerated security controls.

The Request for Comments also asks what "gaps or weaknesses" exist in the current cybersecurity audit requirements and practices, and how to ensure that audits are "thorough and independent."

Risk Assessments

The CPRA amendments also direct the CPPA to issue regulations requiring covered businesses "whose processing of consumers' personal information presents significant risk to consumers' privacy or security" to perform regular risk assessments "with respect to their processing of personal information." Cal. Civ. Code § 1798.185(15)(B). Such risk assessments must consider "whether the processing involves sensitive personal information" and must identify and weigh "the benefits resulting from the processing to the business, the consumer, other stakeholders, and the public, against the potential risks to the rights of the consumer associated with that processing." Under the CCPA, processing of personal information should be restricted if the risks to the consumer's privacy outweigh the benefits of the processing.

The Request for Comments seeks information about other consumer privacy laws that require risk assessments, the types of personal information that present significant risks to consumers' privacy or security, and approaches to "identifying and weighing the benefits and risks of such processing." The Request for Comments specifically asks about the risk assessment approaches outlined in the European Data Protection Board's Guidelines on Data Protection Impact Assessment and in the Colorado Privacy Act ("CPA"), including the benefits and drawbacks of using those assessments to satisfy CCPA requirements. Like the privacy laws in Connecticut and Virginia, the CPA requires controllers to conduct data protection assessments (DPAs) of processing activities that present a heightened risk of harm to consumers. Draft rules under the CPA would require controllers to analyze what personal data is "reasonably necessary" to the processing purpose, unique vulnerabilities, and any alternative processing activities that could be considered (we discussed the CPA's DPA requirements and other aspects of the law in a prior blog post).

The Request for Comments also asks for input on the format of risk assessments, including whether the CPPA should require that risk assessments be submitted to the Agency on a regular basis, and on whether and how the requirements for both risk assessments and cybersecurity audits should differ for companies with less than $25 million in annual gross revenue. 

Automated Decisionmaking Technology

The Request for Comments also seeks input on potential new regulation of automated decisionmaking (ADM) technologies. The CCPA does not define ADM or ADM technologies, leaving the Agency to determine how broadly such regulations may reach. There is little doubt that the CCPA's intent is to reach covered entities using AI/ML-enabled systems to make important decisions and to take other actions in many different sectors of the economy—including finance, healthcare, housing, and employment—even though the CCPA exempts from its coverage many aspects of consumers' personal finance and healthcare information. 

The CPRA amendments direct the CPPA to issue new regulations governing "access and opt-out rights with respect to businesses' use of [ADM] technology, …" The statute states that this must include "profiling" and indicates that covered entities responding to "access requests" must include "meaningful information about the logic involved in those decisionmaking processes, …" and a "description of the likely outcome of the process with respect to the consumer." Cal. Civ. Code § 1798.185(16). The statutory language suggests that the Agency's work will result in rules that impose several new obligations on businesses covered by the CCPA. First, businesses using ADM technologies will likely be required to permit individuals to "opt out" of at least some automated decisions made by such processes, such as those considered to be "profiling." An opt-out right could require human intervention in, or oversight of, certain systems. Second, the new rules are likely to impose new duties of transparency (explainability or interpretability) on businesses using ADM technologies to make decisions that affect individuals, which would require businesses to provide individuals with detailed information about how ADM technologies make decisions, their internal logic, and a description of "the likely outcome of the process with respect to the consumer." As we have previously explained, there is significant uncertainty about how such rights could undermine or affect AI/ML systems that are subject to these new regulations.

Notably, the Request for Comments poses a number of fundamental questions about the potential scope and reach of these new mandates. For example, the Request for Comments asks how the term "automated decisionmaking technology" is defined in other laws, regulations, or best practices, and whether and how such authorities may already govern the use of ADM technologies (see, for example, New York City's law mandating certain transparency and audit obligations for "Automated Employment Decision Tools," which we discussed here). Further, the CPPA asks commenters to explain how other authorities govern "access and opt-out rights" for ADM technologies, and whether such authorities do, or do not, address so-called "algorithmic discrimination," a question raised by other policymakers. For example, the White House's Office of Science and Technology Policy issued its Blueprint for an AI Bill of Rights, which includes a statement that consumers should "not face discrimination by algorithms," including impacts disfavoring people based on their "race, color, ethnicity, sex (including pregnancy, childbirth, and related medical conditions, gender identity, intersex status, and sexual orientation), religion, age, national origin, disability, veteran status, genetic information, or any other classification protected by law," and calls for "proactive equity assessments" ensuring accessibility for people with disabilities.

DWT's Privacy and Security and Artificial Intelligence teams will continue to monitor and report on the CPPA rulemaking process and on other legal and policy developments impacting Cybersecurity, Privacy, and AI.