The United States Department of Justice recently held a workshop to discuss potential concerns about Section 230 of the Communications Decency Act. Section 230 protects online platforms and services from defamation and other types of liability for content the platform did not itself create. It has been a source of legislative concern at least since the 2016 elections, and, while many of those concerns arise from issues surrounding the integrity of the electoral process, there are some interesting privacy-related issues as well. In its news release announcing the workshop, DOJ noted that it would focus on Section 230's "expansive interpretation by the courts, its impact on the American people and business community, and whether improvements to the law should be made."

Background

Section 230 was passed at the dawn of the modern Internet era, as part of the Telecommunications Act of 1996. An earlier court decision had held that if an online service or platform tried to moderate user comments, that effort at editorial control would subject the platform to potential liability for defamation and other harms caused by those comments (essentially, just like a newspaper). But it was widely understood to be impossible for a platform to review and approve thousands or millions of user posts, so the only sensible response would have been for platforms with user-generated content to avoid any kind of moderation at all. That, however, would have made it impossible to develop services or platforms focused on particular topics, or even, highly relevant at the time, platforms that were "family friendly," i.e., that made an effort to block or filter out adult content, including pornographic images.

The legislative solution was Section 230, which (with some exceptions) says that an online service is not liable for (a) content that it itself does not create, or (b) any good-faith efforts it undertakes to block offensive or otherwise inappropriate content. The courts have interpreted Section 230 immunity quite broadly. The provision is regarded not only as promoting a wide range of robust and raucous online speech, but also as enabling the enormous economic growth, over the last 25 years, of online services and platforms that rely on third-party content (in one form or another) to attract users, whether as paying subscribers or as an audience for the advertisements the platform displays. DOJ noted that "proponents claim 230 immunity led to the flourishing of the internet and the creation of the online ecosystem we see today. Opponents, on the other hand, believe that the broad interpretation of Section 230 has prevented solutions to a variety of problems that continue to proliferate to the detriment of victims, law enforcement, and civil discourse."

Concerns about online services and platforms have evolved quite a bit since 1996, but in some ways remain the same. Two current issues in particular relate to privacy and data security, rather than to the traditional defamation and free speech concerns that animate most of the discussion surrounding Section 230.

Nonconsensual Pornography

While concerns about online pornography motivated much of the support for Section 230 back in 1996, since that time a particularly nasty form of adult content, known as nonconsensual pornography (or, colloquially, "revenge porn"), has developed. This material consists of intimate images or video, typically of a woman, posted online without her knowledge or consent, often by an ex-boyfriend. Posting such material is at least potentially tortious, but under existing law, online services or platforms that display it (even if they encourage users to submit it) are protected from liability by Section 230. Section 230 has thus, arguably, led to the emergence of websites focused on obtaining and displaying this type of material.

Companion bills in the House and the Senate would address this issue by:

  • Making it a federal crime to distribute private "intimate visual depictions" without consent; and
  • Eliminating Section 230 immunity for an online service or platform that "knowingly solicits, or knowingly and predominantly distributes" the material.

Public Access to Encryption

A second issue, dating from at least the 1990s but very much alive today, is strong encryption for general public use. Prior to the 1990s, limitations on computer technology generally made strong encryption unavailable to the public at large. That changed in 1991 with the introduction of the "Pretty Good Privacy" cryptosystem, an early salvo in the "Crypto Wars" that continue today; a recent example is the 2016 dispute between Apple and the FBI over unlocking an encrypted iPhone recovered from a dead terrorist in San Bernardino, California. Law enforcement has long argued that giving the public at large access to unbreakable encryption enables wrongdoers to communicate securely, frustrating efforts to bring them to justice, and has pressed industry to deploy only encryption with "back doors" that would permit law enforcement, with appropriate legal authorization, to access encrypted content. Industry has uniformly rejected these pleas, noting that an encryption system known to have a back door would be inherently insecure.

EARN IT Act

Encryption, pornography, and Section 230 have come together with the leak of a draft bill known as the "EARN IT" Act. The bill (evidently drafted by Senator Lindsey Graham) is purportedly directed at concerns about the distribution of child pornography, and would set up a commission to develop industry "best practices" for preventing online child exploitation. The bill would take two additional steps, however, that raise significant concerns for platforms and content providers.

First, it would eliminate a service or platform's Section 230 immunity from civil suits brought by victims of exploitation, unless the provider has certified that it has implemented the best practices laid out by the newly formed commission.

Second, notwithstanding whatever compromises the commission may work out in developing consensus "best practices," the Attorney General would be empowered to modify those practices as he sees fit, meaning that the scope of Section 230 immunity could shift at the Attorney General's discretion. Meanwhile, apps and online services offering strongly encrypted end-to-end communication are increasingly available, and law enforcement routinely points to child pornographers (as well as terrorists, drug dealers, and other malefactors) as wrongdoers unfairly advantaged by their ability to use strong encryption to evade investigation.

Industry Concerns

Commentators quickly noted the potential concern for platforms and services that offer strongly encrypted communications tools: in the name of protecting against online child exploitation, the Attorney General could require back doors into platforms' encrypted communications systems, with Section 230 immunity held hostage to ensure that the platforms comply. And, lest the connection between Section 230 and strong encryption be seen as too obscure, on the very day of the Department of Justice workshop, former NSA General Counsel Stewart Baker, a well-known commentator on these issues, published an article on the prominent Lawfare blog expressly tying encryption issues to Section 230 reform.

It's unclear whether any actual legislation on these controversial issues will advance during this election year. But it would be wise for online platforms and their representatives to monitor the issue closely.