In a thoughtful piece at Protocol earlier this month, Neil Chilson made a number of astute observations about the challenges of legislating technology policy. He did so to caution against amending Section 230 of the Communications Act. Upon closer examination, however, his observations actually buttress calls for reform.
Legislating technology policy can have harmful consequences
"[M]ost technologies," Chilson points out, "are better governed by applying simple, general principles on a case-by-case basis." Legislating, he cautions, freezes rules in place based on the state of then-current technology. And once outdated, that legislation can have unintended consequences and be difficult to fix because of the slow and often arduous legislative process.
The irony here is that Section 230 was itself a legislative attempt in 1996 to set technology policy. And, right on cue, courts' misapplication of Section 230 in today's vastly different technology environment has become counterproductive.
With Section 230, Congress sought to shield what were then largely dial-up bulletin boards from liability when they moderated content. The goal was to encourage such bulletin boards to take proactive steps to protect the public from harmful online behavior. As applied by the courts, however, Section 230 is also shielding today's very different social platforms even when they do not moderate content but instead negligently, recklessly or knowingly facilitate the illegal activity of their users.
Most businesses appropriately face liability under the common law duty of care if they unreasonably fail to curb harm to their customers and the public. Absent that threat, platforms are less likely to moderate content, not more. That's the opposite of Congress' goal and puts the public at greater risk, not less.
The current court interpretation is also denying victims access to the courtroom, including in cases involving terrorism, harassment, revenge porn, housing discrimination, distribution of child sexual abuse materials and other unlawful activity with an online nexus. Moreover, consistent with Chilson's concerns about legislating technology policy, the statutory source of the problem is hindering attempts to fix it.
A common law approach allows for flexible and evolving application
Chilson argues that rather than legislate technology policy, the better approach is usually to rely on court application of common law principles. Case law, he notes, focuses on specific problems as they arise, and can evolve to address new ones. "This focus means that useless case law usually fades out or is expressly eroded by court decisions over time. And special interests cannot easily distort case law because of its decentralized nature and its focus on specific conflicts rather than entire industries."
Again, this is more an indictment of the Section 230 status quo than a reason to oppose reform.
Congress passed Section 230 in response to a very specific case: Stratton Oakmont v. Prodigy. In Stratton, a New York trial court ruled that Prodigy could be held liable for defamatory user posts it was not aware of because it moderated other user posts it was aware of. This ruling created a "moderator's dilemma," in which platforms would be discouraged from moderating content to avoid liability.
Section 230(c)(2) addresses this dilemma by stating that "[n]o provider or user of an interactive computer service shall be held liable on account of … any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected." As Chilson explains, "Congress altered the evolution of case law by enacting Section 230, solving the moderator's dilemma and unleashing a wave of innovation in online platforms."
The problem is that courts have misapplied a different part of Section 230 — Section 230(c)(1) — which states that "[n]o provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider." Even though this language says nothing about shielding platforms from liability for their own behavior when they unreasonably fail to prevent harm, that's how courts have applied it.
If Congress had not stepped in, a New York appeals court might have overturned the Stratton decision, other courts might have taken different approaches, and Stratton might have withered in significance. Had that been the case, we would have avoided the problematic court applications of Section 230(c)(1) and special interests such as the platforms would have been unable to distort the case law to avoid accountability. Instead, the common law would have evolved to address the dial-up problems of yesterday and the social media problems of today.
Reform, not repeal
Section 230 supporters argue that repeal would throw out the baby with the bathwater. "Section 230," Chilson observes, "has protected hundreds of companies and users from being blamed for what others do online, and freed platforms to moderate content without being sued."
One way to address that concern is to reform, rather than repeal, Section 230. Congress could preserve the Section 230(c)(2) language that solves the moderator's dilemma, but correct courts' misapplication of Section 230(c)(1). It could do so by amending Section 230 to ensure it does not preclude holding platforms liable for their own misfeasance when they negligently, recklessly or knowingly fail to curb unlawful activity by their users. That would neither be holding them liable for moderating content nor be treating them like the speaker or publisher of their users' posts. And it would allow courts to continue to apply the flexible common law duty of care in a way that evolves with the changing nature of online platforms and the problems they pose.
Short of legislation, the FCC could also adopt such a construction of Section 230. After all, Section 230 resides in the Communications Act, and the Supreme Court has confirmed that Section 201(b) of that Act gives the FCC independent authority to construe the provisions of the very chapter that houses Section 230.
This would not constitute FCC regulation of the internet, but rather binding FCC guidance that courts would use in future applications of Section 230. The FCC general counsel recently explained in more detail the FCC's authority in this area, and Chairman Ajit Pai has indicated his intent to move forward with a notice of proposed rulemaking seeking comment on how the FCC might construe Section 230.
Some have suggested that Pai is unlikely to be able to issue that notice, let alone a final decision, before FCC leadership changes as a result of the presidential election. But even if he doesn't, it is not inconceivable that a succeeding Democratic FCC chair might take up this cause. Indeed, concern over courts' misapplication of Section 230 to shield platforms when they unreasonably fail to curb unlawful conduct is bipartisan, with disagreement largely confined to issues of bias and misinformation.
Yet another possibility is that the Supreme Court might adopt such a narrowed construction of Section 230. Indeed, Justice Clarence Thomas recently signaled in a written decision on another matter that he believes that interpretation is the correct one, and that he might be inclined to so rule if the opportunity presents itself.
Keeping the good, fixing the bad
Chilson himself acknowledges that the language of Section 230 is not perfect and that reform may be appropriate to avoid its misapplication. He also shares concern for how Section 230 hinders the ability of victims to seek justice. His main hesitation is that reform will result in the creation of some new agency that adopts "reams" of regulation, abandoning Section 230's case-by-case approach, which he says shares many of the benefits of the common law.
Yet judicial application of the common law duty of care is the epitome of flexible, case-by-case adjudication, and Section 230 has been applied to short-circuit that common law, not retain it. A revision or construction of Section 230 that restores the duty of care while preserving Section 230(c)(2)'s content moderation safe harbor would not erect a new agency, but would instead accomplish all that Chilson lauds.