
Section 230 doesn't need to be repealed. We can reform it.

Section 230 supporters argue that repeal would throw out the baby with the bathwater.

Neil Fried

Alphabet CEO Sundar Pichai testifies remotely during a Senate hearing on Section 230. Photo: Michael Reynolds-Pool/Getty Images

In a thoughtful piece at Protocol earlier this month, Neil Chilson made a number of astute observations about the challenges of legislating technology policy. He did so to caution against amending Section 230 of the Communications Act. Upon closer examination, however, his observations actually buttress the case for reform.

Legislating technology policy can have harmful consequences

"[M]ost technologies," Chilson points out, "are better governed by applying simple, general principles on a case-by-case basis." Legislating, he cautions, freezes rules in place based on the state of then-current technology. And once outdated, that legislation can have unintended consequences and be difficult to fix because of the slow and often arduous legislative process.

The irony here is that Section 230 was itself a legislative attempt in 1996 to set technology policy. And, right on cue, courts' misapplication of Section 230 in today's vastly different technology environment has become counterproductive.

With Section 230, Congress sought to shield what were then largely dial-up bulletin boards from liability when they moderated content. The goal was to encourage such bulletin boards to take proactive steps to protect the public from harmful online behavior. As applied by the courts, however, Section 230 is also shielding today's very different social platforms even when they do not moderate content but instead negligently, recklessly or knowingly facilitate the illegal activity of their users.

Most businesses appropriately face liability under the common law duty of care if they unreasonably fail to curb harm to their customers and the public. Absent that threat, platforms are less likely to moderate content, not more. That's the opposite of Congress' goal and puts the public at greater risk, not less.

The current court interpretation is also denying victims access to the courtroom, including in cases involving terrorism, harassment, revenge porn, housing discrimination, distribution of child sexual abuse materials and other unlawful activity with an online nexus. Moreover, consistent with Chilson's concerns about legislating technology policy, the statutory source of the problem is hindering attempts to fix it.

A common law approach allows for flexible and evolving application

Chilson argues that rather than legislate technology policy, the better approach is usually to rely on court application of common law principles. Case law, he notes, focuses on specific problems as they arise, and can evolve to address new ones. "This focus means that useless case law usually fades out or is expressly eroded by court decisions over time. And special interests cannot easily distort case law because of its decentralized nature and its focus on specific conflicts rather than entire industries."

Again, this is more an indictment of the Section 230 status quo than a reason to oppose reform.

Congress passed Section 230 in response to a very specific case: Stratton Oakmont v. Prodigy. In Stratton, a New York trial court ruled that Prodigy could be held liable for defamatory user posts it was not aware of because it moderated other user posts it was aware of. This ruling created a "moderator's dilemma," in which platforms were discouraged from moderating content at all, lest moderation expose them to liability.

Section 230(c)(2) addresses this dilemma by stating that "[n]o provider or user of an interactive computer service shall be held liable on account of … any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected." As Chilson explains, "Congress altered the evolution of case law by enacting Section 230, solving the moderator's dilemma and unleashing a wave of innovation in online platforms."

The problem is that courts have misapplied a different part of Section 230 — Section 230(c)(1) — which states that "[n]o provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider." Even though this language says nothing about shielding platforms from liability for their own behavior when they unreasonably fail to prevent harm, that's how courts have applied it.

If Congress had not stepped in, a New York appeals court might have overturned the Stratton decision, other courts might have taken different approaches, and Stratton might have withered in significance. Had that been the case, we would have avoided the problematic court applications of Section 230(c)(1) and special interests such as the platforms would have been unable to distort the case law to avoid accountability. Instead, the common law would have evolved to address the dial-up problems of yesterday and the social media problems of today.

Reform, not repeal

Section 230 supporters argue that repeal would throw out the baby with the bathwater. "Section 230," Chilson observes, "has protected hundreds of companies and users from being blamed for what others do online, and freed platforms to moderate content without being sued."

One way to address that concern is to reform, rather than repeal, Section 230. Congress could preserve the Section 230(c)(2) language that solves the moderator's dilemma, but correct courts' misapplication of Section 230(c)(1). It could do so by amending Section 230 to ensure it does not preclude holding platforms liable for their own misfeasance when they negligently, recklessly or knowingly fail to curb unlawful activity by their users. That would neither be holding them liable for moderating content nor be treating them like the speaker or publisher of their users' posts. And it would allow courts to continue to apply the flexible common law duty of care in a way that evolves with the changing nature of online platforms and the problems they pose.

Short of legislation, the FCC could also adopt such a construction of Section 230. After all, Section 230 resides in the Communications Act, and the Supreme Court has confirmed that Section 201(b) of that Act gives the FCC independent authority to construe provisions in the very chapter in which Section 230 resides.

This would not constitute FCC regulation of the internet, but rather binding FCC guidance that courts would use in future application of Section 230. The FCC general counsel recently explained the FCC's authority in this area in more detail, and Chairman Ajit Pai has indicated his intent to move forward with a notice of proposed rulemaking seeking comment on how the FCC might construe Section 230.

Some have suggested that Pai is unlikely to be able to issue that notice, let alone a final decision, before FCC leadership changes as a result of the presidential election. But even if he doesn't, it is not inconceivable that a succeeding Democratic FCC chair might take up this cause. Indeed, concern over courts' misapplication of Section 230 to shield platforms when they unreasonably fail to curb unlawful conduct is bipartisan, with disagreement largely confined to issues of bias and misinformation.

Yet another possibility is that the Supreme Court might adopt such a narrowed construction of Section 230. Indeed, Justice Clarence Thomas recently signaled in a written decision on another matter that he believes that interpretation is the correct one, and that he might be inclined to so rule if the opportunity presents itself.

Keeping the good, fixing the bad

Chilson himself acknowledges that the language of Section 230 is not perfect and that reform may be appropriate to avoid its misapplication. He also shares the concern that Section 230 hinders victims' ability to seek justice. His reluctance stems mostly from a fear that reform would create some new agency churning out "reams" of regulation, abandoning Section 230's case-by-case approach, which he says shares many of the benefits of the common law.

Yet judicial application of the common law duty of care is the epitome of flexible, case-by-case adjudication, and Section 230 has been applied to short-circuit that common law, not retain it. A revision or construction of Section 230 that restores the duty of care while preserving Section 230(c)(2)'s content-moderation safe harbor would not erect a new agency; instead, it would accomplish all that Chilson lauds.
