The Colorado Attorney General's Office has published its much-anticipated proposed rules (Proposed Rules) implementing the Colorado Privacy Act (CPA), which, as we discussed in an earlier blog post, was enacted on July 7, 2021, and is scheduled to go into effect on July 1, 2023. The Proposed Rules provide significant new detail and context for a number of obligations created by the CPA and would require companies to do a great deal of work to come into compliance. Moreover, while the Proposed Rules align in many ways with the recently proposed California Consumer Privacy Act (CCPA) rules, they diverge in ways that will undermine interoperability with those rules and will require companies to make tough decisions about how to design and implement their compliance programs. For example, as discussed below, the Proposed Rules require controllers to format privacy policies in a particular way that could conflict with their current practices and with the CCPA Proposed Rules.

Despite these potential conflicts, companies will welcome the attorney general's decision to make some of the provisions in the Proposed Rules aspirational rather than mandatory, giving companies greater flexibility to implement processes that allow compliance with multiple laws.

The attorney general will accept written comments on the Proposed Rules from October 10, 2022, to February 1, 2023, and will hold a rulemaking hearing on February 1, 2023. The attorney general also will hold three stakeholder meetings in November to discuss various aspects of the Proposed Rules.

The Proposed Rules address numerous topics. In this post we focus on the elements that are unique to Colorado's approach or that may present compliance challenges for companies.

Specific Consent Requirements and Data Governance Obligations

The CPA requires a controller—defined in the CPA as an entity that "determines the purposes for and means of processing personal data"—to obtain consent from consumers before:

  • Processing a consumer's sensitive data;
  • Processing personal data of a known child (in which case the controller must obtain a parent or guardian's consent);
  • Selling personal data, processing personal data for targeted advertising, or profiling consumers to make legal or other significant decisions—after a consumer has opted out of these uses; and
  • Processing personal data for purposes "that are not reasonably necessary to, or compatible with the original specified purpose" of the processing.

Consent must be a clear, affirmative action that is freely given, specific, informed and reflects a consumer's unambiguous agreement.

The Proposed Rules provide detailed guidance regarding what it means for consent to be a "clear, affirmative action" (it cannot be "blanketed acceptance of general terms" or "pre-ticked boxes"), "freely given" (it cannot be "bundled" with other terms or made a condition of performing a contract when not necessary to provide the goods or services under the contract), "specific" (consumers must be able to separately consent to each specific purpose), and "informed" (consent disclosures must include certain information, including the names of all third parties and affiliates receiving sensitive data through "sales" or sharing).

The Proposed Rules also set forth a detailed framework for controllers to request and for consumers to provide consent to opt in to processing of personal data after the consumer has opted out of the processing for the stated purpose. Among other things, controllers' requests to consumers to opt in cannot be made through pop-up banners or any other method that "degrades or obstructs" the consumer's experience.

Unlike the other state privacy laws and the CCPA Proposed Rules, the Proposed Rules would require controllers to "refresh" consent obtained from consumers "at regular intervals based on the context and scope of the original [c]onsent, sensitivity of [p]ersonal [d]ata collected, and reasonable expectations of the [c]onsumer." Controllers would be required to refresh consent for sensitive data—including, in particular, biometric identifiers and personal data generated from photographs or audio or video recordings—at least annually.1

Finally, to implement the CPA prohibition against "dark patterns," the Proposed Rules explain that "[c]onsent choice options should be presented to [c]onsumers in a symmetrical way that does not impose unequal weight or focus on one available choice over another," and that "consent choice options should avoid the use of emotionally manipulative language or visuals to coerce or steer [c]onsumer choice," alongside additional guidelines for avoiding "dark patterns" in interface design and choice architecture.

Extensive Internal Processes Needed to Comply with Consumer Rights Requests

The CPA gives consumers the right to request access to and correction and deletion of personal data that controllers have collected about them.

The Proposed Rules detail requirements for methods through which consumers may submit data rights requests. Among other requirements, the methods must not use "dark patterns" and must "be easy for consumers to execute."

Controllers will need to implement a robust data governance framework to comply with consumers' requests. For instance, to fulfill a request to correct personal data, controllers must ensure that they correct consumers' personal data "across all data flows and repositories" and "implement[] measures to ensure that the [p]ersonal [d]ata remains corrected," in addition to "instruct[ing] all [p]rocessors that maintain the [p]ersonal Data at issue to make the necessary corrections in their respective systems and ensure that the Personal Data remains corrected."

Helpfully, the Proposed Rules adopt an approach similar to the framework articulated in the CCPA Proposed Rules, requiring controllers to consider whether the personal data is more likely than not accurate, based on the totality of the circumstances. Moreover, controllers may require consumers to provide documentation if necessary to determine whether the personal data or the consumer's requested correction to the personal data is accurate.

New Category of Sensitive Data Inferences

Like the privacy laws in California, Connecticut, Virginia, and Utah, the CPA subjects "sensitive" personal data to heightened requirements; the CPA requires controllers to obtain consent before processing such data.

The Proposed Rules define a new category of data—"sensitive data inferences"—that does not appear in any other privacy law. Sensitive data inferences are "inferences made by a [c]ontroller based on [p]ersonal [d]ata, alone or in combination with other data, which indicate an individual's racial or ethnic origin; religious beliefs; mental or physical health condition or diagnosis; sex life or sexual orientation; or citizenship or citizenship status." The Proposed Rules provide an example: High-level geolocation information may not itself be "sensitive" data, but if the data shows that an individual visited a mosque—thereby revealing or allowing someone to infer the individual's religious beliefs—that inference would be a sensitive data inference and must be treated as "sensitive data." This approach is consistent with recent trends under the GDPR.2

Controllers may process sensitive data inferences from consumers who are over the age of 13 without consent only if:

  • (1) The processing purpose would be obvious to a reasonable consumer based on context;
  • (2) The controller permanently deletes the sensitive data inferences within 12 hours of collection or completion of the processing activity, whichever comes first;
  • (3) The controller does not sell, share, or transfer the sensitive data inferences to any processors, affiliates, or third parties; and
  • (4) The controller processes the sensitive data inferences only for the purpose disclosed to the consumer.

Controllers who rely on this limited exception must include in their privacy policies both a description of the sensitive data inferences subject to the exception and the retention and deletion timeline for the sensitive data inferences.

New Obligations Related to Privacy Notices

Like other state privacy laws, the CPA requires controllers to provide consumers with a privacy policy that includes certain information about personal data that the controller collects, uses, maintains, and discloses.

Unlike other state laws, however, the Proposed Rules require controllers to organize their privacy policies by "processing purpose" and to provide for each processing purpose the categories of personal data (including sensitive data) to be processed, the categories of personal data sold or shared with third parties, and the categories of third parties to whom the controller sells, or with whom the controller shares, personal data.

Each processing purpose must be "described in a sufficiently unambiguous, specific, and clear manner, such that the way [p]ersonal [d]ata is processed is understood by and predictable to the average [c]onsumer, the [c]ontroller, [t]hird [p]arties, and enforcement authorities." In addition, a controller must describe any sensitive data inferences that it will delete within 12 hours under the rule that allows processing of such inferences without consent. Controllers will need to either create a new Colorado-specific section of their privacy policies or perhaps reformat their existing policies to address each processing activity separately.

The Proposed Rules also diverge from current laws that govern when companies must provide notice regarding changes to their privacy policies. The Federal Trade Commission has long required companies that make "material" changes to their privacy policies to provide notice of such changes to consumers and to obtain consent before applying such material changes retroactively to previously collected personal data. Colorado's Proposed Rules expand this obligation by requiring controllers to notify consumers of "substantive" or material changes, which would include changes to the categories of personal data processed, the processing purposes, the identity of the controller, or the methods by which consumers can exercise their rights. The Proposed Rules do not explain, however, what types of changes would be "substantive" but not also "material."

Technical Specifications Required for Universal Opt-Out Mechanisms

The CPA requires controllers to honor opt-out requests made through universal opt-out mechanisms (UOOMs). This requirement becomes effective on July 1, 2024, a year after the rest of the Proposed Rules take effect.

The Proposed Rules provide the basic technical specification for UOOMs and state that mechanisms "must allow for [c]onsumers to automatically communicate their opt-out choice with multiple [c]ontrollers." UOOMs that come pre-installed on a device must not be configured to opt out by default, as a default setting would not reflect a consumer's freely given choice. However, if a consumer adopts a tool that is not pre-installed on a device but is specifically marketed as a privacy control, controllers must treat the consumer's use of the tool with the UOOM as an affirmative, freely given, and unambiguous choice to opt out.

The Proposed Rules also provide guidance to "platform[s], developer[s], [and] provider[s]" of UOOMs, such as browser manufacturers or browser plug-in developers, and require them to ensure that the design of their mechanisms satisfies all of the CPA's requirements. Among other things, developers of UOOMs must ensure that controllers can determine whether a consumer is a Colorado resident and that the UOOM is sending a legitimate opt-out request.

The attorney general's office will maintain and periodically update a public list of UOOMs that have been recognized as meeting the CPA's standards and release the initial list no later than April 1, 2024. Such mechanisms may operate through a means other than by sending an opt-out signal, and could maintain a "do not sell" list that could be accessed through automated means and that controllers would be required to consult. In this respect, the Proposed Rules go further than the CCPA Proposed Rules, which do not provide any guidance regarding technical specifications for the global opt-out mechanism mandated in those rules other than requiring businesses to honor signals that are commonly used and recognized.

Tiered Approach to Profiling

The CPA defines profiling as "any form of automated processing of personal data to evaluate, analyze, or predict personal aspects concerning an identified or identifiable individual's economic situation, health, personal preferences, interests, reliability, behavior, location, or movements," and requires controllers to do the following with respect to profiling activities:

  • Conduct a data protection assessment, to be updated annually, if the profiling presents a reasonably foreseeable risk of (1) unfair or deceptive treatment of or unlawful disparate impact on consumers, (2) financial or physical injury to consumers, (3) a physical or other intrusion upon the solitude or seclusion or private affairs or concerns of consumers if the intrusion would be offensive to a reasonable person, or (4) other substantial injury to consumers.
  • List in their privacy policy whether any of their processing activities include profiling in furtherance of decisions that produce legal or similarly significant effects.
  • Provide consumers the opportunity to opt out of profiling when the profiling is done in furtherance of decisions that produce legal or similarly significant effects, including a method to exercise the right to opt out that is clearly and conspicuously set out in the privacy notice as well as in a clear, conspicuous, and readily accessible location outside of the privacy notice.
  • Obtain specific consent for profiling in furtherance of decisions that produce legal or similarly significant effects that includes detailed information relating to the processing.

The Proposed Rules break down "automated processing" into three categories: Solely Automated Processing, Human Reviewed Automated Processing (where the level of human review does not rise to the level required for Human Involved Automated Processing), and Human Involved Automated Processing. The Proposed Rules also provide an exception to the above-mentioned opt-out requirement for Human Involved Automated Processing, subject to certain disclosure requirements.

By adopting these definitions and the exception, the attorney general has adopted a "human in the loop" method of regulating artificial intelligence and algorithmic decision-making.

Practical Approach to Loyalty Programs

The CPA allows companies to offer consumers different prices, rates, or quality of goods and services in connection with participation in a "bona fide" loyalty, rewards, discount, or similar program (collectively, "loyalty programs").

The Proposed Rules take a practical approach to loyalty programs, recognizing that personal data is often essential to providing them. Accordingly, under the Proposed Rules, companies may stop providing loyalty program benefits to consumers who delete personal data that is essential to providing the program. And if a consumer refuses to consent to the processing of sensitive personal data that is necessary to provide the loyalty program, the company is not required to provide personalized loyalty program benefits to the consumer. The company must, however, still provide non-personalized loyalty program benefits to the consumer.

The Proposed Rules also require companies to provide a number of specific disclosures to consumers regarding the loyalty programs.

Specific Issues for Data Protection Assessments

Like the privacy laws in Connecticut and Virginia, the CPA requires controllers to conduct data protection assessments (DPAs) of processing activities that present a heightened risk of harm to consumers. The Proposed Rules require controllers to conduct and document a "genuine, thoughtful analysis" of an extensive list of issues, including, among other things, the extent to which the personal data to be processed is "adequate, relevant, and limited to what is reasonably necessary" to the processing purpose; consumers' expectations based on privacy notices, consent disclosures, and unique vulnerabilities; and the alternative processing activities that the company considered.

The DPA must involve all relevant internal parties in the analysis and must include external parties who are helpful in identifying and assessing risks of harm to consumers.

Controllers must retain DPAs in an electronic, transferable form for at least three years after the conclusion of the relevant processing activity.

Before a controller modifies existing data processing activities in ways that put consumers at heightened risk of harm, it must update its DPAs. The Proposed Rules provide a list of factors to consider in determining whether a processing activity has changed enough to warrant a new DPA. To comply with this requirement, controllers will have to establish and implement policies and procedures to periodically review assessments of processing activities for potential harms to consumers and material changes.

The Proposed Rules reinforce language in the CPA that allows controllers to use assessments conducted under other privacy laws to comply with the CPA's DPA requirement—so long as the assessment is reasonably similar in scope and effect to what is required by the CPA. This will be helpful to companies complying with similar requirements in Virginia and Connecticut.


1  The CPA notably did not define biometric data, despite including provisions governing how such data may be collected and used. To fill this gap, the Proposed Rules define Biometric Data as "Biometric Identifiers that are used or intended to be used, singly or in combination with each other or with other [p]ersonal [d]ata, for identification purposes." Biometric Identifiers, in turn, mean "data generated by the technological processing, measurement, or analysis of an individual's biological, physical, or behavioral characteristics, including but not limited to a fingerprint, a voiceprint, eye retinas, irises, facial mapping, facial geometry, facial templates, or other unique biological, physical, or behavioral patterns or characteristics." Biometric Data, however, does not include photographs or audio or video recordings (or any data generated from such recordings or photographs) unless used for identification purposes. The requirement to refresh consent, however, applies to personal data derived from photos and audio or video recordings regardless of whether they are used to identify specific individuals.
2  Earlier this year, a Lithuanian court found that publication of information that allowed "special category" personal data (in this case, sexual orientation) to be inferred violated the requirement under GDPR Article 9 to obtain explicit consent before disclosing special category data. Specifically, the publication of the name of an individual's spouse who shared the same gender constituted an indirect disclosure of the individual's sexual orientation.