Carl Szabo



Why Section 230 protects kids, and what its critics get wrong


Critics incorrectly claim that Section 230 protects bad online platforms from the enforcement of major crimes.

Photo: Ulrich Perrey/Getty Images

Despite current debate over harmful content online and Section 230 of the Communications Decency Act, the truth is that Section 230 is the law that makes our internet a better place. Section 230 is often blamed for all bad content and illegal activity on the internet, but under the law, any activity that's criminal offline is criminal online. In fact, Section 230 provides no shield to criminals from enforcement of state, local and federal laws, whether they commit their crimes on or off the internet.

Take the horrific example of Backpage, an online platform that enabled sex trafficking. In 2018, the federal government seized control of the website, shut it down and threw its owners in prison. The federal government swooped in and enforced federal criminal law. In fact, Section 230 was irrelevant in this case because the law provides no protection for platforms that contribute to criminal wrongdoing. The law also offers no protection for child exploitation or copyright violations.

Similarly, Section 230 offered no protection for online platform Silk Road: an anything-goes marketplace where users could sell guns and drugs and even contract for murder. The government shut down this website and enforced criminal law on its owners because, again, Section 230 does not shield platforms from federal criminal law.

These stories made headlines. But today, critics incorrectly claim that Section 230 protects bad online platforms from the enforcement of major crimes. Platforms that host horrendous content like child pornography or sex trafficking should and can be dealt with by the law. When law enforcement fails to adequately police the proliferation of this content, we must discuss how to better ensure they do so.

In a recent piece in Protocol, James Steyer and Bruce Reed claimed that Section 230 enables online harm. They argued that Section 230 protects bad platforms when they host content that exploits children. Thankfully, Steyer and Reed are entirely incorrect on this point. Section 230 offers no protection against such exploitation and, in fact, makes clear to judges that they should not grant immunity in such cases.

Steyer and Reed do raise some forms of content that platforms would not be held liable for hosting under Section 230, but we should not conflate the severity of content that is inappropriate for children with content that exploits children, like child pornography. If content is available on online platforms that some consider inappropriate for children, then a discussion over how media of all types can do more to protect kids is necessary. But upending the internet as we know it by eliminating Section 230 as Steyer and Reed suggest is not the nuanced approach for such important discussions.

Indeed, online platforms are able to moderate and make themselves more appropriate for children because of Section 230. From YouTube Kids to forced opt-in tagging and screening for child-friendly content, YouTube works on new content moderation guidelines each day, dedicated to making our children safe online.

Without Section 230, YouTube would have no shield behind which to build its family-friendly practices. Section 230 spares platforms like YouTube from a "moderator's dilemma": under the conduit-liability court rulings that predated Section 230, a platform had to either refuse to moderate content on its site to keep its protection from liability, or give up hosting user-created content easily and seamlessly.

Groups like Common Sense Media, founded by Steyer, rely on Section 230 to protect the public. By providing parents with useful resources for identifying whether movies and shows are appropriate for their kids, Common Sense relies on parents and kids to add their own reviews and ratings. Without Section 230, they would risk assuming liability for all statements made in these reviews if any of them were found to be defamatory.

The world is a better place when Common Sense Media can host reviews that empower parents, and that world is best realized when Section 230 remains the law of the land.

Steyer and Reed also mistakenly justify their drastic changes by citing the only amendment to Section 230, a 2018 law called the Fight Online Sex Trafficking Act. FOSTA's supporters have provided little evidence of its efficacy; their main argument, that the law enabled the FBI to take down Backpage, is false because the FBI took down the site before FOSTA became law. FOSTA's author has also argued that it "eliminated 90 percent of sex-trafficking ads."

But these arguments have been decisively debunked by the Washington Post, and the track record of the law is particularly controversial. Many women's advocates have raised concerns that the law has put women in greater danger, and news reports from San Francisco show that after the passage of FOSTA, law enforcement saw a surge in street-based sex work and sex trafficking.

FOSTA shows that blaming Section 230 for law enforcement's failures to stop criminals is not only a factual error: It also risks harming victims and decimating the widespread benefits of free speech online.

Since the inception of the internet, and even before the enactment of Section 230, any platform that took an "anything goes" approach to user-generated content has been immune from liability for the content of its users. For example, bookstores are not legally liable for any law-breaking content in the books they sell: the book's author is. The same is true for internet platforms.

But without Section 230, as was the case before its passage, if an internet platform engages in the type of content moderation required to protect children online, this protection evaporates and the platform suddenly becomes liable for every user's post. That leads to a clear disincentive for sites to moderate content.

When Congress enacted Section 230, it empowered platforms to remove "lewd, lascivious, or otherwise objectionable" content without becoming liable for all content on their sites. Just as a Good Samaritan is not liable for harm caused by trying to save someone on the street, platforms are not liable for harm caused when they seek to clean up their corners of the internet.

In fact, the only websites that have little to lose from a repeal of Section 230 are those like 8chan and other content cesspools of the internet that neglect to moderate content. Websites that actually focus on family-friendly content, like YouTube, KidsReads or Common Sense Media, would likely not exist as they do today. If Section 230 were repealed, then what parents fear most would come to fruition: an increase in hate speech, violence, conspiracy videos and other harmful content online.

Steyer and Reed are right to bring attention to corners of the internet where children are at risk, and we must look for ways to ensure law enforcement has the tools it needs to tackle child exploitation. But by looking at Section 230 rather than at an increase in funding for enforcement against child exploitation, they are advocating for a policy that would severely underserve their goal of protecting children.

Section 230 is what has helped the internet become safer for children as it has grown and matured. As we search for ways to protect children from harm, we should all examine the true impacts of our policy prescriptions before presenting them as the golden ticket to a family-friendly future.
