In This Issue


Regulators Focus on Data-Driven Pricing in Response to Perceptions of Unfairness and Opaque Use of Personal Information

Businesses in a wide range of industries have long used personal information to set consumer pricing to some degree, such as through loyalty programs. But some regulators are now criticizing the practice as harmful to consumers, especially when it is invisible to them and produces pricing differences between consumers that they do not understand.

Key takeaways are:

  • Data-driven pricing involves processing of consumer personal information and may—in some circumstances—run afoul of existing state privacy laws prohibiting unauthorized or undisclosed secondary uses of personal information.
  • State legislatures are beginning to introduce bills that target data-driven pricing directly.
  • Regulatory scrutiny will likely increase because of data-driven pricing's impact on costs to consumers.

Below we explain what data-driven pricing is, highlight recent regulator activity, and list next steps for those considering the use of these tools. 

Data-Driven Pricing Overview

There is no legal definition of data-driven pricing, but California Attorney General Rob Bonta recently described it in a press release as "businesses' use of consumers' personal information to set targeted, individualized prices for products and services." Consumers buying the same product at the same time may see different prices and may not understand why. Companies have long engaged in variations of this practice, but it is far more powerful today given the vast amounts of consumer data now available and the artificial intelligence tools used to analyze it.

Personal information used for data-driven pricing may include location data, browsing history, demographic information, loyalty program data, time of day, gender, and many other categories of data. This data is abundant because it can be collected through numerous channels, such as connected devices and online activity.

Existing Privacy Law and Emerging Legislation May Limit Some Forms of Data-Driven Pricing

Regulators in California and Maryland, along with the FTC, are among many across the U.S. examining data-driven pricing more closely.

California
On January 27, 2026, California AG Bonta issued a press release announcing an enforcement sweep targeting what the AG termed "surveillance pricing" and explaining the potential impact on consumers of data-driven pricing that violates the California Consumer Privacy Act (CCPA) and other existing state laws. The AG noted that, among other things, data-driven pricing may violate the CCPA's "purpose limitation principle," which limits use of personal information to purposes consistent with a consumer's reasonable expectations, as shaped largely by the business's disclosures to consumers and the context in which the personal information was collected.

The AG added that "practices like surveillance pricing may undermine consumer trust, unfairly raise prices, and when conducted without proper disclosure or beyond reasonable expectations, may violate California law…Unless a business discloses that it uses a consumer's personal information to set prices, surveillance pricing may be invisible to the consumer, as consumers usually cannot and do not consult with each other to compare the prices they have been offered."

As part of the sweep, the AG sent requests to businesses with a "significant" online presence in the retail, grocery, and hotel industries, asking how they use consumer "shopping and internet browsing history, location, demographics, inferential, or other data" to set prices.



Maryland
On April 28, 2026, Maryland Governor Wes Moore signed HB-895, which has been characterized as the first bill in the nation directly targeting data-driven pricing. It regulates the use of certain "dynamic pricing" and consumer data by brick-and-mortar grocery operations, food retailers, and third-party online delivery platforms (not just food delivery). After considerable revision before passage, including striking a reference to artificial intelligence and removing the term "surveillance pricing," HB-895 now focuses largely on preventing the use of personal data to set different prices for different consumers.

Under HB-895, a food retailer or third-party delivery service provider may not:

  • Engage in dynamic pricing, which is defined as the "discriminatory practice" of using personal data to offer or set a personalized price for goods or services specific to a consumer based on the consumer's personal data; or
  • Use "protected class data" to "[o]ffer, advertise, or sell a consumer good or service to a consumer to whom the protected class data pertains …if the use of the protected class data has the effect of withholding or denying from the consumer an accommodation, an advantage, or a privilege accorded to others." "Protected class data" is information already identified in existing laws against discrimination.

HB-895 will be enforced by the Maryland AG, has no private right of action, and includes a 45-day cure provision. Notably, there are numerous exclusions, such as those for certain loyalty programs and for activities undertaken with consumer consent.

Many of the practices targeted by HB-895 would already be difficult or prohibited under data minimization principles in the Maryland Online Data Privacy Act, which, for example, prohibits the collection, sharing, or use of sensitive information such as location and health data unless "strictly necessary" to provide a product or service requested by a consumer, regardless of consumer consent. 

The FTC
In July 2024, FTC staff initiated an investigation into what it also called "surveillance pricing," requesting information from eight companies that provide algorithmic pricing tools to businesses for price targeting, consumer segmentation, and profiling. The FTC's resulting January 2025 staff report included several examples of practices it appeared to regard as problematic. And the FTC press release accompanying the January 2025 report commented that "The FTC should continue to investigate surveillance pricing practices because Americans deserve to know how their private data is being used to set the prices they pay and whether firms are charging different people different prices for the same good or service." There has been no formal action since then, however.

Next Steps

When evaluating data-driven pricing practices, consider:

  • Existing privacy laws. Data-driven pricing technologies process consumers' personal information to set prices and thus are already subject to state privacy laws, even if no law specifically targeting the practice exists. Don't assume that a practice is legal simply because it complies with a new data-driven pricing law.
  • Targeted activities. Digital shelf pricing, in particular, has drawn criticism. Other targeted activities involve loyalty programs, although those have long been in place and may (as with Maryland) be carved out of legislation to the extent they are not discriminatory.
  • Targeted industries. The retail, grocery, hotel, and travel industries are high priorities.
  • Third-party tools. Before using third-party data-driven pricing tools, understand the tool, how it uses personal information, and how it is vetted for compliance with state and federal law.
  • New laws and regulatory initiatives. Stay alert for developments in this emerging area. There is considerable regulatory fanfare around data-driven pricing, and in response the retail industry has been actively lobbying against, or to limit, these new laws, warning of unintended consequences that could harm consumers. The result could be a wave of new laws and enforcement or something far more limited.

Contact: David Rice

Back to top


Enforcement of Colorado AI Act Delayed

Colorado Attorney General Phil Weiser announced in a joint motion in the x.AI v. Weiser litigation pending in federal district court in Colorado that his office would delay enforcement of SB24-205, the Colorado Artificial Intelligence Act (the Act), which becomes effective on June 30, 2026. The federal district court considering x.AI's challenge to the Act granted the motion and ordered that the attorney general "not initiate enforcement, including but not limited to the initiation of an investigation, for alleged violations of SB24-205 (or any legislation replacing or amending SB24-205 enacted during this legislative session) that occurred or may occur on or before 14 days after the date the Court issues a ruling on x.AI's forthcoming motion for a preliminary injunction in this case." The delay gives the Colorado legislature time to review a framework proposed by the Colorado AI Policy Work Group (the Proposal), which Governor Jared Polis tasked with developing a proposal to amend the Act to address concerns about its broad scope, and also gives the attorney general's office time to complete a rulemaking.

The Act currently imposes numerous obligations on developers and deployers of "high-risk" artificial intelligence (AI) systems, including, among other things, requirements to provide disclosures, documentation, and notice of reasonably foreseeable risks; implement risk management policies and programs; complete impact assessments; and report incidents of algorithmic discrimination to the Colorado attorney general. The Act also requires developers and deployers of AI systems generally to make certain disclosures to consumers.

The bill would substantially modify the Act to focus on automated decision-making technology (ADMT), defined as technology that processes personal data and uses computation to generate output, including predictions, recommendations, classifications, rankings, scores, or other information that is used to make, guide, or assist a decision, judgment, or determination concerning an individual. Among other things, the bill would require developers of ADMT used to "materially influence a consequential decision" to provide a deployer of such technology with certain information. The bill in some respects would expand the definition of "consequential decision," and raise the threshold that triggers developers' and deployers' obligations by requiring the use of ADMT to "materially influence" the decision rather than just being a "substantial factor" in the decision.

Specifically, the bill would require developers of such ADMT to provide certain technical documentation, notices, and disclosures to deployers, and deployers would have to provide consumers—at the point of interaction—with certain notices about the use of ADMT for consequential decisions, as well as certain information when the use of ADMT results in an adverse outcome. Consumers would have rights to correct personal data and request meaningful human review and reconsideration of the consequential decision, among other things.  

Violations of the bill would be a deceptive trade practice under the Colorado Consumer Protection Act. The bill also provides that a developer or deployer could be held liable for unlawful discrimination under Colorado law for consequential decisions materially influenced by ADMT. Indemnification against liability for violations of the Colorado anti-discrimination law resulting from the development or deployment of ADMT would be void. The Colorado attorney general would have exclusive enforcement authority with respect to violations of the disclosure requirements and consumer rights. The attorney general would be required to provide a 60-day opportunity to cure an alleged violation, so long as a cure is deemed possible and the attorney general does not find that the developer or deployer knowingly or repeatedly violated the bill. The bill gives the attorney general both mandatory and discretionary rulemaking authority. It would take effect on January 1, 2027, unless the district court grants the motion for preliminary injunction that x.AI is scheduled to file "within 28 days after final adoption of rulemaking implementing SB24-205 or any legislation that may replace or amend SB24-205."

Contact: Nancy Libin

Back to top


House Republicans Introduce Comprehensive Federal Privacy Legislation

On April 22, 2026, Congressman Brett Guthrie (R-Ky.) and Congressman John Joyce (R-Pa.) of the House Energy and Commerce Committee introduced the Securing and Establishing Consumer Uniform Rights and Enforcement over Data Act (the SECURE Data Act or the Act), which would establish a national framework for data privacy rights and obligations, preempting the patchwork of state consumer privacy laws that has now grown to include 22 state laws. The Act was introduced the same day that House Financial Services Chairman French Hill (R-Ark.) introduced the Guidelines for Use, Access, and Responsible Disclosure of Financial Data Act (the GUARD Financial Data Act), which would update Title V of the Gramm-Leach-Bliley Act (GLBA) by imposing data minimization requirements and providing rights to consumers with respect to their nonpublic personal information.

The Act would establish a framework similar to the business-friendly state consumer privacy law in Kentucky, but it also has some unique features. It would preempt not just comprehensive state consumer privacy laws but also sector-specific state laws, such as biometrics laws, and possibly certain federal laws, such as the Video Privacy Protection Act, which is conspicuously absent from the list of federal laws that the Act does not preempt.

The Act would use the controller-processor paradigm that all state consumer privacy laws except the California Consumer Privacy Act use. The Act would apply to any person subject to the Federal Trade Commission (FTC) Act and common carriers (notwithstanding the common carrier exemption in the FTC Act) that: (1) conduct business in the U.S. or offer a product or service to residents of the U.S.; or (2) process or sell the personal data of U.S. residents and either (a) collect and process the personal data of more than 200,000 consumers annually—excluding personal data processed solely to complete a transaction—and had $25 million or more in annual gross revenue during the preceding 12-month period or (b) collect and process the personal data of 100,000 or more consumers annually—excluding data processed solely to complete a transaction—and derive more than 25% of annual gross revenue from the "sale" of personal data.
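
For illustration only, the applicability thresholds described above can be sketched as decision logic. This is a rough reading of how this summary frames the test, not the statutory text, and the field and function names below are hypothetical.

```python
# Illustrative sketch only: a rough reading of the SECURE Data Act applicability
# thresholds as summarized above. Field and function names are hypothetical;
# this is not the statutory text.
from dataclasses import dataclass


@dataclass
class Entity:
    subject_to_ftc_act_or_common_carrier: bool
    conducts_business_in_us: bool        # or offers a product or service to U.S. residents
    processes_or_sells_us_personal_data: bool
    consumers_processed_annually: int    # excluding data processed solely to complete a transaction
    annual_gross_revenue_usd: int
    data_sale_revenue_share: float       # fraction of annual gross revenue from "sales" of personal data


def act_may_apply(e: Entity) -> bool:
    # Threshold requirement: the person must be subject to the FTC Act or be a common carrier.
    if not e.subject_to_ftc_act_or_common_carrier:
        return False
    # Prong (1): conducts business in the U.S. or offers products/services to U.S. residents.
    prong_one = e.conducts_business_in_us
    # Prong (2)(a): more than 200,000 consumers and $25 million or more in annual gross revenue.
    prong_two_a = (e.consumers_processed_annually > 200_000
                   and e.annual_gross_revenue_usd >= 25_000_000)
    # Prong (2)(b): 100,000 or more consumers and more than 25% of revenue from data "sales."
    prong_two_b = (e.consumers_processed_annually >= 100_000
                   and e.data_sale_revenue_share > 0.25)
    prong_two = e.processes_or_sells_us_personal_data and (prong_two_a or prong_two_b)
    return prong_one or prong_two
```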

Consumers' Rights

The Act would give consumers the right to (a) confirm whether their personal data is being processed; (b) access, correct, and delete such personal data; (c) obtain a copy of personal data that the consumer has provided in a portable and readily useable format, if technically feasible; and (d) opt out of targeted ads, "sales" of personal data, and reliance on profiling to make solely automated decisions to deny healthcare, rental or leasing of housing, or employment. "Sales" would be limited to the exchange of personal data for monetary consideration only. Consumers would have the right to appeal denials of their requests.

Controllers' and Processors' Obligations

Controllers would have the usual obligations to do the following:

  • Provide a privacy notice
  • Limit the collection of personal data to what is adequate, relevant, and reasonably necessary for each purpose for which the data is processed, as disclosed to the consumer
  • Refrain from processing personal data for any purpose that is not reasonably necessary for or compatible with the disclosed purpose unless prior consent is obtained
  • Establish, implement, and maintain reasonable security safeguards
  • Refrain from processing personal data in violation of anti-discrimination laws and from discriminating against consumers who exercise their rights, except that loyalty programs would be permissible
  • Obtain consent before processing sensitive data
  • Process the personal data of children under 13 years of age in accordance with the Children's Online Privacy Protection Act
  • Comply with requests from consumers to exercise their rights

In addition, controllers would be required to provide notice and obtain verifiable consent from parents of consumers who are at least 13 but under 16 years of age before processing such consumers' personal data. And controllers would be required to disclose to consumers whether any personal data is transferred or sold to or processed or stored in China, Russia, Iran, or North Korea. Processors would be required to adhere to controllers' instructions, assist them in meeting their obligations under the Act, and enter into contracts with controllers that set forth instructions for processing and the rights and obligations of both parties.

Data Brokers

The Act would establish rules for data brokers, requiring them to publicly disclose their status as a "data broker," as well as register with and provide certain information to the Federal Trade Commission, which would be required to maintain a publicly available registry. A controller is a "data broker" if it collects and processes the personal data of a consumer who is not a customer or client of the controller—or a user, reader, or subscriber of a product or service of the controller—and if the controller derives 50% or more of annual gross revenue from the sale of such personal data.
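
Read literally, that definition reduces to two conditions, sketched below for illustration only; the names are hypothetical and this is not the statutory text.

```python
# Illustrative sketch only: the two-part "data broker" test as summarized above.
def is_data_broker(processes_non_customer_personal_data: bool,
                   revenue_share_from_selling_that_data: float) -> bool:
    # "Non-customer" here means consumers who are not customers, clients, users,
    # readers, or subscribers of the controller's products or services.
    return processes_non_customer_personal_data and revenue_share_from_selling_that_data >= 0.50
```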

Exceptions

The Act would provide for the usual exceptions and would exempt governmental entities, entities that act as processors to governmental entities, financial institutions regulated by GLBA, covered entities and business associates subject to the Health Insurance Portability and Accountability Act, nonprofit organizations, institutions of higher education, the National Center for Missing and Exploited Children, statutorily created entities that pay for claims related to insurance company liquidation, and registered futures associations and national securities associations. The Act also would exclude certain types of data, including employee-related data, certain health-related data subject to other laws and regulations, and personal data processed in accordance with GLBA and the Fair Credit Reporting Act.

Enforcement

Violations of the Act would be deemed violations of Section 5 of the FTC Act and would be enforceable by the FTC. The common carrier exemption would not apply for purposes of the Act, meaning that the FTC would have jurisdiction to enforce the Act against common carriers. The Act would give state attorneys general authority to enforce the Act as well, provided that no FTC action had commenced. Both the FTC and state attorneys general would be required to give controllers and processors 45 days to cure alleged violations before bringing an initial action.

Contact: Nancy Libin

Back to top


Changes to FedRAMP Incident Reporting Requirements

FedRAMP has proposed a significant overhaul of its incident reporting requirements through a new Request for Comment, RFC-0031, "Updated Incident Communications Procedures." FedRAMP has framed the proposal as a shift toward a clear, modern, rules-based reporting framework, motivated by its observation that the existing framework has been inconsistently followed because the requirements are broad and often unclear, with the result that most cloud service providers (CSPs) rarely notify FedRAMP of incidents. FedRAMP believes a clear set of reporting requirements must be established so that CSPs understand and can implement incident reporting and meet their ongoing FedRAMP certification requirements.

Key provisions of the proposal include: (1) a narrower incident definition (borrowing from FISMA and removing the "potential" element from existing requirements); (2) tiered severity ratings and reporting deadlines (including ongoing status reports); and (3) a service availability status portal to replace availability incident notifications.

FedRAMP-authorized organizations should carefully review the proposal against their existing incident response plans and submit any suggested changes to the FedRAMP Board before the comment period closes on May 12, 2026.

Contact: Andrew Lewis

Back to top