Sen. Lindsey Graham, pictured in the Senate hallway with Sen. Maria Cantwell, has circulated a draft bill that would revoke Section 230 protections for companies that fail to stop child abuse and sexual exploitation online.

Photo: Chris Maddaloni/Roll Call via Getty Images
Section 230 under siege: A guide to all the ways the law could be gutted

Almost everyone wants to do something about the web's most treasured law. But no one can agree on what.

Democrats and Republicans on Capitol Hill agree on one thing: Section 230 of the Communications Decency Act, which has shaped the internet as we know it, has got to change.

They just don't agree on why or how. In the past few months, members of Congress, the Trump administration, and Attorney General William Barr have all slammed the 24-year-old law, which protects online companies from liability for anything people post on their networks, and keeps them from being punished for the moderation decisions they make "in good faith." Next week, Barr's Department of Justice will hold a first-of-its-kind workshop to debate how and whether to overhaul it.

There will be a lot to discuss. Depending on which side of the aisle politicians sit on, Section 230 is either the reason social platforms don't moderate enough or the reason they moderate too much. Democrats blame it for letting Facebook keep false political ads and doctored videos on its site, and Republicans point to it as the reason social media sites can silence conservative commentators in their crackdown on hate speech and fake news.

Frustrated by their inability to hold tech giants accountable, in the past few years politicians have proposed every option — from revoking Section 230 to yanking it from trade deals to chipping away at it piece by piece. That's all while tech companies and civil liberties groups try to hold the line to keep the law that built the web from being dismantled.

Here's a detailed look at all the options floating around the halls of Congress ahead of next week's workshop.

The idea: Impose a 'reasonable standard of care'

In 2018, Boston University law professor Danielle Citron and Ben Wittes, editor-in-chief of Lawfare, published a paper that quickly spread on Capitol Hill. It proposed a subtle but substantive change to Section 230 that would require platforms to take "reasonable steps to prevent or address unlawful uses of its services once warned about such uses."

That, the authors write, would restore Section 230 to its original intention, which was to encourage platforms to moderate bad content, but protect them if they make mistakes. They argued that too often Section 230 has shielded sites that willingly ignore serious threats that lead to real-world harm or that exist for the purpose of, say, publishing revenge porn.

By instituting what Citron and Wittes call a "reasonable standard of care," courts could crack down on these so-called "bad samaritans," while protecting good samaritan companies that put in a good faith effort to moderate.

"It is not inevitable that society suffers these harmful consequences in exchange for a legal environment that fosters speech and innovation," they wrote. "This exchange is a choice — and it's a bad choice."

Companies like IBM back this approach. "We've sort of sidelined the opportunity for the courts to allow some sort of reasonable care to emerge on its own," says Ryan Hagemann, co-director of IBM's Policy Lab.

The criticism

Critics worry that this approach would require courts to decide, on a case-by-case basis, what type of moderation is "reasonable." Even if companies ultimately prevailed in court, they'd be drawn into frequent, lengthy and costly legal battles that leave them unprotected.

"That eliminates one of the main benefits of Section 230, which is you're not taking a platform to court over every dispute someone has about user content," says Jeff Kosseff, assistant professor of cybersecurity law in the United States Naval Academy and author of a book on Section 230, "The Twenty-Six Words That Created the Internet," who will be speaking at the DOJ event in DC.

"That proposal is an easy way to say: Let's get rid of Section 230 and make it functionally unavailable," says Eric Goldman, professor of law at Santa Clara University School of Law. Goldman will also be in DC for the workshop.

The idea: Create carveouts for individual harms

In 2018, Congress passed its first amendment to Section 230 in the form of SESTA/FOSTA, a bundle of bills that made it illegal for online platforms to knowingly assist, facilitate or support sex trafficking. That one law has since led to a litany of similar bills and proposals that aim to regulate a range of other harms, from drug trafficking to child sexual exploitation to illegal rental listings.

The proposal getting the most attention is a draft bill being circulated by South Carolina Republican Sen. Lindsey Graham. The so-called EARN IT Act (short for Eliminating Abusive and Rampant Neglect of Interactive Technologies) would target child abuse and sexual exploitation by setting up a commission to create best practices for addressing the problem. The attorney general would have the power to shape those recommendations, and online companies that fail to meet the resulting standards could lose Section 230 immunity. Democratic Sen. Richard Blumenthal of Connecticut is also reportedly working on the bill, but did not respond to Protocol's request for comment.

The criticism

Tech companies and free expression advocates argue that any carveout to Section 230 has unintended consequences. They point to reports of sex workers being forced back out onto the streets in the aftermath of SESTA/FOSTA.

"All I've heard about have been problems. I haven't heard of a single benefit," says Carl Szabo, vice president and general counsel of NetChoice, a trade association representing Facebook, Google, Twitter and other tech companies. Szabo will be at the DOJ workshop next week.

They are particularly wary of Graham's bill. It would allow the attorney general to effectively approve the rules that grant or deny Section 230 immunity. At a time when Barr wants tech giants to create backdoors in encrypted technology, many fear the Department of Justice could give tech platforms an ultimatum, forcing them to build backdoors in exchange for immunity.

"It strikes us as deeply, deeply concerning," says Liz Woolery, deputy director of the Free Expression Project at the Center for Democracy and Technology. "The range of issues it touches on raises everything from Fourth Amendment concerns to free expression concerns."

For Kosseff, the problem with all of these carveouts is they could end up turning Section 230 into Swiss cheese. Once there are enough exceptions to immunity, he argues, "At a certain point, it doesn't make sense to have Section 230 anymore."

The idea: Expand enforcement to states

Since it was signed into law, Section 230 has included an exception for enforcement of federal crimes. (That's what led to the Department of Justice's 2018 crackdown on Backpage.com.) It does not, however, allow state attorneys general to enforce state crimes. That's by design, Kosseff says. The authors of the law didn't want online companies to have to comply with a patchwork of state laws. But the web has grown dramatically since 1996; so has the amount of criminal activity taking place on it, which Kosseff says has strained federal enforcers.

In his conversations with lawmakers, Kosseff urges them to consider allowing state attorneys general to enforce state laws that mirror federal laws.

"It's a resource issue," he says. "When you add all 50 states' attorneys general, that's more manpower."

The criticism

Goldman, of Santa Clara University, says of all of the proposals he's heard, this is the least objectionable. But it's not perfect, either. Because state attorneys general are elected, they could be more driven by politics than anything else, he says.

"They don't necessarily bring cases that are in the best interest of their constituents," he says. "They bring the ones that will generate buzz."

Goldman points to the case of Mississippi Attorney General Jim Hood, who subpoenaed Google in 2014. Shortly after, leaked emails revealed that the Motion Picture Association of America had been lobbying attorneys general to do just that, in a campaign known as Project Goliath. Google later sued Hood, accusing him of colluding with the MPAA.

"We've seen some gross abuses here," Goldman says. "The state attorneys general are just a different animal than federal prosecutors."`

The idea: Submit platforms to a political bias test

Since the early days of the 2016 election, Republicans have been consumed by the idea that tech giants are silencing conservative voices. It stems from Facebook's short-lived Trending Topics portal, which reportedly suppressed news from conservative sources. Since then, the allegation that tech platforms are censoring the right has been repeated — without more than anecdotal evidence — as fact.

That explains the series of bills seeking to rid online platforms of this alleged political bias. One such bill sponsored by Missouri Republican Sen. Josh Hawley would require companies that meet certain size requirements to be audited by the Federal Trade Commission every two years to prove they do not "moderate information provided by other information content providers in a politically biased manner." Hawley also recently proposed overhauling the FTC, replacing its current five-member panel with a single director, who would report to the attorney general. This, Hawley argues, would allow the FTC to more effectively regulate tech giants.

In the House, Arizona Rep. Paul Gosar proposed a bill that would change the part of Section 230 that protects companies' moderation decisions regarding offensive material. Gosar's bill would make it so companies were only protected for moderating "unlawful material," not content that's otherwise objectionable.

The criticism

Neither bill has picked up a co-sponsor. Hawley's Section 230 proposal in particular was roundly criticized, including by one of the original co-authors of Section 230, former California Republican Rep. Christopher Cox. In a Wall Street Journal op-ed titled "Hawley's Bad Idea to Protect Speech," Cox called Hawley's pitch "misguided," saying the FTC is in no position to vet the political neutrality of companies with hundreds of millions or billions of users.

Goldman, meanwhile, argues that forcing private entities to publish something they don't want to publish is its own form of censorship. "The entire architecture of both bills is designed to censor internet services by forcing them to carry content they don't think is appropriate," he says.

The idea: Study the effects of amending Section 230

Not all of the proposals circling Section 230 would chip away at it. In December, Section 230's other author, Oregon Democratic Sen. Ron Wyden, co-sponsored a bill with Massachusetts Sen. Elizabeth Warren called the SAFE SEX Workers Study Act. It would order the Department of Health and Human Services to study the impact of SESTA/FOSTA on sex workers. California Democratic Rep. Ro Khanna sponsored an identical version in the House.

When he announced it, Wyden said the study would help Congress "make informed policy decisions, rather than chasing knee-jerk responses." The bill was backed by several other Democrats and a laundry list of advocacy groups. It did not, however, have a single Republican backer.

Another bill, backed by Mississippi Democratic Rep. Bennie Thompson, doesn't take on Section 230 directly, but it would create a commission to study the proliferation of terrorist content online.

The criticism

The main concern about these studies is ensuring that they're well-designed. Woolery says the Center for Democracy and Technology is watching to ensure that the studies include the proper input and aren't too broad. Woolery says that initially, for example, the Thompson bill would have studied how online platforms are used "in furtherance of domestic or international terrorism, other illegal activity that poses a homeland or national security threat … or to carry out a foreign influence campaign that poses a homeland or national security threat to the United States."

She says the CDT and other groups convinced Thompson's office to drop the "illegal activity" clause and narrowly tailor the bill to terrorism and foreign influence campaigns. They also stressed the need to include people with civil liberties backgrounds on the commission.

"In seeing its evolution," Woolery says, "we can see Congress is coming to terms with the nuances of policy-making around the internet, balancing security with privacy."

At least, that's the hope. Next week's Department of Justice workshop will test just how nuanced the debate around Section 230 really is.
