Senate Bill 6281, the "Washington Privacy Act" (the "Act"), passed the Washington State Senate on February 14, 2020. A House committee approved an amended version of the Senate bill on March 2, 2020; the bill now moves to the House floor. This post provides a deep dive into what the two versions of the bill would do if enacted.

Without question, the most significant difference between the two bills is how they handle private rights of action. The Senate bill provides no such right; instead, all enforcement is to be handled by the state Attorney General. The House version states that a violation of the new law would constitute a violation of the state's consumer protection law, which would then permit individuals to sue for damages and injunctive relief. That consumer protection law, however, does not provide for automatic statutory damages. Instead, suits may proceed only if the consumer experiences "injury … in his or her business or property," which has been interpreted to exclude "personal injury." The House version would also impose stricter data minimization obligations, mainly limiting processing to that needed to provide services requested by a consumer or to undertake actions a consumer has requested.


Applicability

The Act would apply to entities that conduct business in Washington and to entities that target Washington residents with their goods or services, provided that the entity either:

  • (a) Controls or processes personal data of at least 100,000 consumers per year; or
  • (b) Processes or controls personal data of at least 25,000 consumers and derives more than 50% (Senate version) or 25% (House version) of its gross revenue from the sale of personal data. The House version would exclude from the 25,000 count any "payment-only" transactions "where no data about consumers are retained."

The Act would also apply to institutions of higher education and nonprofits, but not until three years after it comes into effect.

The Act would exempt certain types of information regulated under other Washington state and federal laws, including the following:

  • Washington Uniform Health Care Information Act and HIPAA
  • Fair Credit Reporting Act
  • Gramm-Leach-Bliley Act
  • Driver's Privacy Protection Act of 1994
  • Washington Student User Privacy Act
  • Family Educational Rights and Privacy Act
  • Farm Credit Act of 1971
  • Children's Online Privacy Protection Act

Consumer Rights

Entities subject to the Act would be required to recognize a range of new consumer privacy rights:

  • Right to Access: Consumers would be able to confirm whether an entity (technically, a "controller") is processing personal data about the consumer, and would be able to request a copy of the personal data the business has. Pseudonymous and de-identified data are excluded from the scope of consumer rights and data requests.
  • Right to Correction: Consumers would have the right to correct inaccurate data about them, taking into account why the business had the data in the first place, and what it was doing with it.
  • Right to Deletion: Consumers would have the right to require the business to delete personal information about them.
  • Right to Data Portability: Consumers would have the right to obtain, in a portable and readily reusable format, personal data about the consumer that the consumer herself provided to the business, so that the consumer could easily transmit the data to another business.
  • Right to Opt-Out: Consumers would be able to opt out of the processing of personal data about them for purposes of targeted advertising, the sale of personal data, or certain profiling activities regulated under the Act.

The Act also provides for an appeal process for consumers to challenge a refusal to act on a data request.

The House version of the bill specifies that, for consumers subject to guardianship or conservatorship, the guardian/conservator would be entitled to exercise the consumer's rights on his or her behalf.

Scope of "Personal Data" Covered

The Act applies to personal data of individual Washington residents—called "consumers"—acting in their personal capacity; it does not cover information disclosed in a commercial or employment context. Personal data is broadly defined as "any information that is linked or reasonably linkable to an identified or identifiable natural person."

Personal data would not include de-identified data or publicly available information.

Rights and Obligations of Data Controllers and Processors

In an approach broadly similar to the European Union's General Data Protection Regulation ("GDPR"), the Act categorizes entities that process personal data into "data controllers" and "processors." (A controller "determines the purposes and means of processing" data; a "processor" actually does the processing. An entity can be a controller and a processor at the same time; which category applies depends on the circumstances.) The Act would require that controller-processor relationships be established in binding contracts that set forth the rights and obligations of each party. Controllers and processors could not contractually relieve themselves of the liabilities imposed upon them by the Act.


The obligations of controllers are organized around the principles of transparency, purpose specification, data minimization, avoiding secondary use, security, nondiscrimination, sensitive data, and nonwaiver of consumer rights. Controllers would have to:

  • Provide consumers with reasonably straightforward privacy notices that include the categories and purposes of personal data processing; a description of the sharing with, and/or sale of personal data to, third parties; and instructions for exercising consumer rights, including the right to opt-out.
  • Establish, implement, and maintain reasonable and appropriate administrative, technical, and physical data security practices to protect the confidentiality, integrity, and accessibility of personal data.
  • Not process personal data in violation of applicable nondiscrimination laws, and not discriminate against consumers for exercising rights granted under the law. However, a controller may offer financial incentives in connection with a "bona fide" benefits program.
  • Except as provided under the Act, obtain prior consent before processing sensitive data about the consumer. Sensitive data, a subset of personal data, includes (a) personal data that reveals racial or ethnic origin, religious beliefs, mental or physical health condition or diagnosis, sexual orientation, or citizenship or immigration status; (b) genetic or biometric data processed for purposes of uniquely identifying a natural person; (c) the personal data of a known child; and (d) specific geolocation data (data identifying a consumer's location within a radius of 1,750 feet).
  • Under the House version only, meet more stringent data minimization requirements: a controller's collection of personal data would have to be limited to that "reasonably necessary to provide services requested by a consumer, to conduct an activity that a consumer has requested," or to undertake certain other specified activities, such as verifying consumer requests for access to or deletion of data.
  • Complete data protection assessments for specific personal data processing activities, including targeted advertising; the sale of data; certain types of profiling involving sensitive data; and any processing that presents a heightened risk of harm to consumers. The controller would have to provide such assessments to the state Attorney General upon request in the context of an investigation under the Act.

Controllers would have the right to:

  • Require processors to delete or return data at the end of the engagement;
  • Audit and inspect the processor’s operations; and
  • Receive notice and an opportunity to object to subcontractor engagements.


In addition to meeting processor-specific requirements for confidentiality and security, the Act would require processors to:

  • Follow the controller's instructions;
  • Assist the controller in meeting its own obligations to respond to consumer requests to exercise their rights;
  • Meet security, breach notification, and data protection assessment requirements;
  • Delete or return data at the end of the engagement at the controller's direction, and submit to audits and inspections of the processor's operations; and
  • Provide controllers with notice and an opportunity to object to subcontractor engagements.

The Act provides that the obligations and liabilities for controllers and processors should not impair their ability to process data for exclusively internal purposes, such as for improving or repairing services, identifying and repairing technical errors, or for other internal operations that are reasonably aligned with the consumer's expectations regarding processing, subject to some limitations.

Facial Recognition

The Act includes several provisions specifically addressing facial recognition technology, which is "technology that analyzes facial features and is used for the identification, verification, or persistent tracking of consumers in still or video images." The definition covers not merely identifying a particular consumer by name, but includes determining whether the consumer's image (a "facial template" composed of a "pattern of facial features extracted from one or more images") matches an image already in a database of images, whether or not specifically identified.

Specific provisions addressing facial recognition include:

  • Facial recognition technology providers must make APIs available so that the technology can be tested for accuracy and unfair performance differences across distinct subpopulations—defined by visually detectable characteristics or other objectively determinable or self-identified protected characteristics. Providers must mitigate any identified performance differences, and provide related documentation about the technology and testing.
  • Service contracts involving a facial recognition technology provider must prohibit controllers from using the technology to unlawfully discriminate against individuals or groups of consumers.
  • When facial recognition technology is used in public spaces, controllers must provide a notice that includes information about the purpose of the deployment of the technology and where consumers can find more information about it, and obtain prior consent from consumers to use their image in the service (subject to certain exceptions).
  • Controllers must ensure meaningful human review of legally or similarly significant decisions regarding consumers that were initially made or supported by facial recognition services.
  • Controllers must not enroll consumers in facial recognition services in connection with "a bona fide loyalty, rewards, premium features, discounts, or club card" program.
  • The House version of the bill would bar controllers from enrolling consumers in a facial recognition system without consent in certain cases where the Senate bill might permit such enrollment.

These provisions on facial recognition exempt voluntary facial recognition services used to verify flight passengers pursuant to federal transportation regulations, although the Act would require prior notice to and consent from consumers, and would impose a 24-hour retention limit on images captured by airlines in connection with such services.


Enforcement

Under the version of the bill as passed by the Washington State Senate, the state attorney general would have exclusive authority to enforce the Act by bringing an action in the name of the state or on behalf of state residents. Sanctions could include injunctions and civil penalties of up to $7,500 per violation. Penalty receipts are to be deposited in a newly created consumer privacy account in the state treasury, which may only be appropriated and spent for purposes of the newly established Office of Privacy and Data Protection.

The House version, by contrast, expressly states that violations of the Act constitute violations of the Washington Consumer Protection Act, which provides individuals a private right of action.

This difference between the two bills will likely be a source of significant controversy as the bill is being considered in the House. If the House version passes, then it will go back to the Senate for further consideration.


As drafted, the Act would take effect on July 31, 2021.


Comparison with the GDPR and the CCPA

The Act is deliberately modeled after the GDPR—businesses that interact with EU customers will be familiar with the organization of the Act around data controllers and data processors. Even so, it differs from the GDPR in several important ways. Most obvious are the specific provisions regarding facial recognition services which simultaneously acknowledge the increasing demand for such services and the corresponding anxiety about their privacy implications and potential misuse. However, practitioners will also note that—unlike the GDPR—the Act does not require controllers to provide a legal basis for processing, and in most cases, prior consent to processing is not necessary.

The Act also differs significantly from the CCPA. First, the Act is less prescriptive about what, specifically, controllers and processors must do to comply with the law. Like the CCPA, the Act is concerned with the sale of personal data, and grants consumers the right to opt out of such sales. The Act defines "sale" as "the exchange of personal data for monetary or other valuable consideration by the controller to a third party," which is much clearer and narrower than the definition of "sale" under the CCPA. The Act also does not require that entities be co-branded in order to be considered affiliates. That said, the AdTech industry will certainly note the Act's attention to the use of personal data in connection with targeted advertising, which is also an indirect target of the CCPA, regardless of whether the targeted advertising process involves a sale.

A key point of controversy is whether the final version of the Act (if one passes) will contain a private right of action. The Senate version of the bill does not include such a right, while the House version does.

The next two weeks will likely be eventful as the legislature decides what action to take not only with respect to the Act, but also with respect to the range of other, more specific privacy bills that have been introduced this legislative session. The session ends on March 12.