For companies known for using technology and analysis of big data to disrupt markets, the pandemic has created new opportunities, both to develop products that meet the demands of a stay-at-home society and to share and analyze data in ways that promote our health and security. However, where new opportunities require new uses of personal information, a company’s need to move fast to meet demand can often be at odds with the need to carefully map out data workflows to assess and address privacy concerns.

With Congress having failed thus far to pass comprehensive consumer privacy legislation, and California’s groundbreaking consumer privacy law offering individual rights only after data has been collected, companies have little legal guidance as to how to resolve this tension.

However, regulators are showing their willingness to use existing enforcement powers under unfair and deceptive trade practices acts to investigate privacy concerns of new or suddenly popular products. A number of Senators have also shown that they are paying close attention to privacy issues that arise as public health authorities use big data analytics to support their efforts to contain COVID-19.

A group of Senate Democrats sent a letter to Google in March 2020 questioning the company’s practices around sharing data with government entities, including the measures taken to safeguard patient information entered into a website developed by Google’s sister company that facilitates checking symptoms and obtaining testing for coronavirus. A similar letter was sent to Apple about its virus screening application and website. In addition, the Senate Commerce Committee is holding a “paper hearing” on April 9, 2020, to examine the “recent uses of aggregate and anonymized consumer data to identify potential hotspots of coronavirus transmission and to help accelerate the development of treatments.”

This government interest should serve as a reminder to Big Tech that even in a pandemic, privacy guardrails should guide product development. Data protection assessments, in which attorneys and business executives work together before a product is released to map how it will collect and use data, are critical to meeting legal requirements in the area of consumer protection. These assessments should:

  • Evaluate the benefits of the data collection, use, and disclosure against the risk of harm to the individual;
  • Enforce data minimization during product development by documenting that each data element collected is necessary to achieve the business purpose;
  • Analyze the data collection, use, and disclosure practices against the organization’s privacy policy; and
  • Evaluate whether any special security measures need to be taken in light of the sensitivity of the data at issue.

These assessments can demonstrate to regulators that the organization was thoughtful about consumer privacy and security, and they can also help businesses maximize the value of the data they collect.

On the policymaking side, the current climate presents an opportunity to think more intelligently about privacy regulation. The benefits of innovative data use and sharing of data for public health purposes challenge some principles that privacy advocates have held firm to in recent debates about state and federal legislation.

For example, the California Consumer Privacy Act (CCPA) prohibits the “use [of] personal information collected for additional purposes without providing the consumer with notice” (Cal. Civ. Code 1798.100(b)), and similar purpose-limitation provisions are common in other privacy legislation. Purpose limitations are a tricky concept to begin with: organizations often use the same data elements in multiple ways, all with an eye toward providing products and services, and discrete identification of each purpose is not always possible.

Moreover, providing supplemental notice is difficult when an organization does not necessarily have an email or mailing address for each individual from whom it collects data (for example, in the case of cookie data). And unlike the EU General Data Protection Regulation, the CCPA contains no exception to its requirements, or other accommodation, for public-interest uses of data.

Policymakers should consider how rigid application of privacy principles could impede important uses of data in the public interest. In the current pandemic, for instance, using big data may be critical to society’s ability to flatten the epidemiological curve and shore up the economy. We should think creatively about how we might enable uses of data that are important to the public interest while mitigating risks to privacy.

For instance, companies that use location information to track individuals’ movements could minimize the privacy risks to individuals by deleting the information within a set period after pinpointing areas that are potential hot spots, as sketched below. And insurance companies could be barred from using such data to determine which of their insureds present a greater risk.
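As one illustration of how such a time-limited retention rule might look in practice, the following sketch drops location records older than a fixed window. The 14-day window, field names, and record structure are all hypothetical assumptions for illustration, not a description of any company’s actual practice.

```python
from datetime import datetime, timedelta, timezone

# Hypothetical retention window; the right period is a policy choice
# (14 days is an assumption for illustration only).
RETENTION = timedelta(days=14)

def purge_stale_locations(records, now=None):
    """Return only location records newer than the retention window.

    Older records are dropped outright rather than archived, so stale
    movement data cannot later be repurposed for unrelated ends.
    """
    now = now or datetime.now(timezone.utc)
    return [r for r in records if now - r["collected_at"] <= RETENTION]

# Hypothetical pings: the one-day-old record is kept, the
# 30-day-old record is discarded.
now = datetime.now(timezone.utc)
pings = [
    {"device": "a", "area": "tract_101", "collected_at": now - timedelta(days=1)},
    {"device": "b", "area": "tract_202", "collected_at": now - timedelta(days=30)},
]
print(purge_stale_locations(pings, now))  # only device "a" remains
```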

Data de-identification tools also have a role to play. Indeed, the pandemic highlights the need for sensible national standards governing what it means to “de-identify” data. For example, Google and other Big Tech companies have been sharing what Google calls “anonymous” location data to allow public health officials to see where individuals are congregating despite shelter-in-place orders.

Organizations frequently use the term “anonymous,” including in privacy policies, without specifying what steps they have taken to render data unlinkable to a person and without acknowledging that sophisticated data scientists can re-identify individuals based on very few data points.
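To make concrete what a technical de-identification standard could specify, the sketch below applies one common technique, k-anonymity-style aggregation with small-cell suppression, to hypothetical location records. The threshold, field names, and data are assumptions for illustration; this is not a description of how Google or any other company actually processes location data.

```python
from collections import Counter

# Hypothetical minimum bucket size: no released count may describe
# fewer than K people (10 is an assumption for illustration).
K = 10

def aggregate_visits(records, k=K):
    """Count visits per (area, hour) bucket and suppress any bucket
    smaller than k, so no published figure describes fewer than k people."""
    counts = Counter((r["area"], r["hour"]) for r in records)
    return {bucket: n for bucket, n in counts.items() if n >= k}

# Hypothetical data: 12 visits to one census tract, 3 to another.
visits = [{"area": "tract_101", "hour": 9}] * 12 \
       + [{"area": "tract_202", "hour": 9}] * 3
print(aggregate_visits(visits))
# {('tract_101', 9): 12} -- the three-person bucket is suppressed
```

Even a rule this simple involves judgment calls, such as the size of k and the granularity of the location and time buckets, which is precisely why agreed-upon standards would be valuable.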

While the Federal Trade Commission and the National Institute of Standards and Technology have both published guidance on de-identification, this guidance is not law. Moreover, the CCPA imposes a different standard for de-identification, one that arguably cannot be met where a line of data refers to an individual (see Cal. Civ. Code 1798.140(h)).

An opportunity now exists for regulators, industry, and privacy advocates to come together to develop both pragmatic technical standards for removing identifying details from data sets and model policies for the handling of de-identified data. Such rules could have public health benefits and also allow organizations to work with big data in new and innovative ways while still protecting individual privacy.



The facts, laws, and regulations regarding COVID-19 are developing rapidly. Since the date of publication, there may be new or additional information not referenced in this advisory. Please consult with your legal counsel for guidance.

DWT will continue to provide up-to-date insights and virtual events regarding COVID-19 concerns. Our most recent insights, as well as information about recorded and upcoming virtual events, are available at www.dwt.com/COVID-19.