Senators Mark Warner, D-Va., and Deb Fischer, R-Neb., introduced the “Deceptive Experiences To Online Users Reduction (DETOUR) Act” on April 9, 2019. The bill covers “large online operators” (those with more than 100 million authenticated users) and addresses three main issues:

  1. Behavioral or psychological experiments on users;
  2. User interfaces (UIs) that are designed to (or in actual operation) obscure, subvert, or impair user autonomy regarding consent to privacy policies or to the provision and use of user data; and
  3. UIs designed to cultivate “compulsive usage” by users under 13.

Behavioral or Psychological Experiments or Research

The bill would ban “behavioral or psychological” experiments or research on users without the “informed consent” of participants. Experiments or research are defined as “the study, including through human experimentation, of overt or observable actions and mental phenomena inferred from behavior, including interactions between and among individuals and the activities of social groups.” “Informed consent” would require a full explanation of the “benefits, risks and consequences” of participating in the experiment. Notably, under the bill, language in general terms of service could not supply informed consent, and children under 13 would be deemed incapable of granting it.

In addition, the bill would require large online operators to disclose to users and the public, at least quarterly, any experiments undertaken to promote “engagement or product conversion.” Each large online operator would also have to establish an “independent review board” to review and approve experiments in advance. The board would have to be able to “require modification” to or “disapprove” any proposed experiments. Each large online operator would further have to submit detailed information about its review board to the Federal Trade Commission (FTC), including the names and resumes of board members, the compensation they receive, any conflicts of interest they may have, how the board reports to management, how it is notified of proposed experiments, and how it can veto or modify them.

Obscuring, Subverting, or Impairing User Autonomy, Decision-Making, or Choice

The bill would make it unlawful for large online operators to “design, modify or manipulate” user interfaces if the purpose or effect of the UI is to “obscur[e], subvert[], or impair[] user autonomy, decision-making, or choice to obtain consent or user data.” This provision is an effort to ban so-called “dark patterns,” i.e., the use by online services of deceptive UIs and default settings, informed by behavioral research, that are intended to get users to agree to things (privacy policies, data disclosures, etc.) that a fully informed, careful user would not accept and that advantage the provider of the UIs.[1]

Enforcement of these provisions is complex and multifaceted:

  • Direct FTC Enforcement. The FTC is directed to treat a violation of the statute as “a violation of a rule defining an unfair or deceptive act or practice” under 15 U.S.C. § 57a(a)(1)(B). This would permit the FTC to sue for “such relief as the court finds necessary to redress [the] injury” under 15 U.S.C. § 57b(b), without the need to follow the burdensome FTC-specific rulemaking procedures laid out in 15 U.S.C. § 57a.
  • Enforcement by a Registered “Professional Standards Body.” The bill contemplates that “an association of large online operators” will register with the FTC as “a professional standards body.” This group would be charged with developing “guidance and bright-line rules for the development and design of technology products of large online operators.” It would “enforce compliance by its members”; would be open to any large online operator; and would have to have at least one “representative of users” on its board of directors. The “bright-line rules” would include “safe harbors,” i.e., conduct that “does not have the purpose or substantial effect of subverting or impairing user autonomy, decision-making, or choice.”
  • Good-Faith Reliance. The FTC would be barred from bringing an enforcement action where a large online operator “relied in good faith” on guidance from the group.[2]

These provisions rest on the premise that the design and operation of UIs can obscure, subvert, or impair users’ ability to make choices for themselves. The bill would thus effectively require industry and the FTC to acknowledge and act upon the extensive body of research in behavioral economics and neuroeconomics addressing consumers’ bounded rationality and systematic (and exploitable) decision-making biases. That is, the bill would require consideration of people’s inherent cognitive biases and limitations in determining whether “consent” to an operator’s privacy terms or data collection is (in GDPR terms) “freely given, specific, informed and unambiguous.”

This basic concept will surely be controversial, and not only with industry. The FTC itself has not generally embraced behavioral economics or neuroeconomics in assessing whether business conduct is unfair or deceptive,[3] and may be institutionally reluctant to do so.

Compulsive Usage

The bill also addresses practices by large online operators that encourage “compulsive usage” by children under 13. “Compulsive usage” is defined as a “response stimulated by external factors that causes an individual to engage in repetitive, purposeful, and intentional behavior causing psychological distress, loss of control, anxiety, depression, or harmful stress responses.” Large online operators could not design or deploy a UI directed towards children “with the purpose or substantial effect of cultivating compulsive usage, including video auto-play functions initiated without the consent of a user.” The “professional standards body” noted above would be expected to identify safe harbors for UIs directed towards children.

FTC Rulemaking

Finally, the bill directs the FTC to conduct a rulemaking to address the following topics:

  • Rules and procedures for obtaining informed consent;
  • Rules regarding the formation and operation of the independent review boards;
  • Rules regarding the formation and operation of the professional standards bodies;
  • Development of safe harbors, i.e., “conduct that does not have the purpose or substantial effect of subverting or impairing user autonomy, decision-making, or choice, or of cultivating compulsive usage for children.”

Observations

The DETOUR Act is, at present, simply proposed legislation, and there is no reason to think that it will become law any time soon. It is significant, however, as one of the first legislative proposals, if not the first, that would directly require industry and regulators (notably, the FTC) to look beneath the surface of user “consent” (click-throughs, check-boxes, and the like) and to consider the ways in which the design of UIs and other aspects of online services might “obscure,” “subvert,” or “impair” users’ autonomy and decision-making independence when they supposedly grant consent to the collection and use of information gleaned from their online activities. The implications for the information-driven, advertising-supported online ecosystem could be profound.


[1] Of course, a UI can be “deceptive” under existing law by, for example, failing to accurately describe what information a website or app will collect or the situations in which it will be collected. See, e.g., United States v. Path, Inc., Civ. No. C-13-0448, Complaint (N.D. Cal. January 31, 2013) (app allegedly automatically uploaded and stored extensive contact information from users’ phones even though its UI suggested only one of several options would have that effect); id., Consent Decree (N.D. Cal. February 8, 2013). The DETOUR Act, however, is clearly intended to reach more broadly than “deception” as currently understood.

[2] It is rare but not unprecedented for a private body to define legal safe harbors. For example, the Communications Assistance for Law Enforcement Act, 47 U.S.C. §§ 1001 et seq. (CALEA), requires network operators to be able to support legal wiretapping, but an operator that cannot do so in a given case cannot be fined for non-compliance if its system conforms to an applicable industry-developed standard that has been approved by the Federal Communications Commission.

[3] The Norwegian Consumer Council raised these concerns in June 2018, filing a complaint with European regulators based on its study “Deceived by Design.” The claim was that certain large online entities had designed their UIs, including the sequence and presentation of required click-throughs regarding privacy policies, to subvert the intentions of users. Also in June 2018, EPIC and other US-based privacy groups submitted the same study to the FTC, which has taken no public action in response to the filing.