Politics

Reddit worries it’s going to be crushed in the fight against Big Tech

Reddit's general counsel says attacks on Section 230 ignore that there are different kinds of moderation.

Reddit will be watching from the sidelines during the hearing.

Illustration: Reddit

Benjamin Lee, Reddit's general counsel, gets emotional about the potential loss of Section 230 of the Communications Decency Act. He paused for a long time before answering a question about whether he's frustrated by the state of play around the internet's favorite law, which Joe Biden and President Trump both want to repeal. "I apologize, I'm trying to hold back my full emotional weight of how I feel about this, on a certain level," he told Protocol.

Much of the conversation around Section 230 has revolved around the missteps of Facebook, Twitter and Google, which mediate the vast majority of online conversation and regularly draw ire from lawmakers on both sides of the aisle over alleged censorship or the mishandling of viral misinformation. On Wednesday, Mark Zuckerberg, Jack Dorsey and Sundar Pichai will testify before a Republican-led Senate panel about the future of 230 — and whether their actions have proven that it's time to change the law.

But smaller social media networks like Reddit rely on Section 230, too, and they're afraid that their business models will turn into collateral damage as the government tries to rein in the major platforms. Reddit will be watching from the sidelines during the hearing.

Protocol spoke with Lee about what's missing from the Section 230 debate, what reforming the law could mean for competition and why he's optimistic about where we go from here.

This interview has been edited and condensed for clarity.

What do you think is currently missing from the Section 230 debate?

What's missing currently is how important and critical Section 230 is to allowing competition against Big Tech, and encouraging platforms like Reddit to moderate in good faith and ultimately fulfill the promise and potential of the open internet. Section 230 was drafted in this really elegant way to protect not just providers of these services, but users of these services as well.

Section 230 reads, "No provider or user of an interactive computer service shall be treated as the publisher or speaker," so Section 230 protects the decisions of our users, Reddit users, as much as it protects Reddit's decisions themselves. It protects the decisions of our volunteer moderators, it protects the decisions of our users every time they vote on content, it protects their everyday decisions to curate content for our communities and protect their communities from unwanted content. Something we really wanted to emphasize in the context of our FCC comment is how much we work in partnership with our communities of users; we wanted to drive that point home by filing our FCC comment in partnership with one of our communities [r/LGBT].

You filed that comment with the r/LGBT community. What does 230 mean to that group?

Reddit has a unique, layered approach to content moderation that resembles our own democracy. We have site-wide content rules that apply to everyone, much akin to our federal laws. But just as the United States is a union of state governments, Reddit is a network of communities. Each community has its own volunteer moderation team.

Reddit basically allows every member of a community to vote on content. Voting is important to our democracy; it's just as important to moderation of the content shown to a Reddit community. So we work in partnership with every user on Reddit to protect all of the communities from unwanted content, including protecting communities who are coming together, much like the community highlighted in the FCC comments. These are communities that come together to provide a place, a safe place for authentic conversation. Section 230 basically allows us to do this.

Let's say Section 230 is repealed tomorrow. What would that mean immediately for Reddit?

It would be pretty bad. It's sometimes hard for me to even fully grasp the implications of it because Section 230 was devised at a time when most of the platforms that provided places for people to come together resembled Reddit more than the platforms we see today. Those platforms were put into this dilemma due to existing law that predates 230 — and that law basically rewarded the platforms that did not look for bad content.

If you actually took proactive measures like we did, and if your community took proactive measures like they do, then you were held fully liable for that content. That was the law, and that would become the law if 230 were repealed.

There's some disagreement among experts about this: For the most part, do you think changing the law would result in platforms working harder to remove horrible stuff from their platforms, or would it make the platforms wary of removing any content?

I think the irony is that they're both correct, in the sense that the law creates a perverse incentive that pushes you to the far extremes. On the one hand, you allow everything, even the worst type of content on the internet, because that's the best way to avoid liability. On the other, you avoid liability by restricting the amount of content so much that people aren't allowed to say anything actually meaningful or authentic.

Critics have pointed out that 230 allows platforms to be exempt from state and local criminal laws, even when they're knowingly facilitating illegal activity. Is there room for reform there?

I think the debate regarding the intersection of federal and state law around 230 is a complicated one, and I think there's an opportunity for some sophisticated thinking about how best to alleviate some of that tension. Section 230 was never devised to protect providers from facilitating criminal conduct; that was never the intent of Section 230, and by and large it doesn't do that. In fact, I think that a lot of the appeals to modify 230 in that context unfortunately end up having the exact opposite consequences, primarily because most of what I've seen in this context is so focused on targeting limits to 230 that are premised on companies that moderate very differently than we do.

They're focused on these giant, centrally moderated corporations that have an industrialized model of content moderation. So unfortunately, the sorts of targeted limits they're looking at for Section 230 ironically end up benefiting these largest companies by placing significant burden and cost on smaller companies like us.

Wednesday's congressional hearing will revolve around lawmakers' gripes with Facebook, Twitter and Google. Both sides will argue this is proof it's time to reform 230. Is it frustrating for Reddit to potentially have to face legislative repercussions for the actions of Big Tech?

It's frightening as well as sad. It's just fundamentally unfortunate, as it fails to recognize and appreciate that there are many different approaches to moderation, such as Reddit's approach to community moderation. I apologize, I'm trying to hold back my full emotional weight of how I feel about this, on a certain level. I think that right now, the world really needs a different approach to content moderation. Reddit is one of those approaches; we are trying really hard with a different approach, one that focuses on the community and our communities of users. We need more communities that create belonging.

And these proposals, to a large extent, are really designed in a way that has these large, unintended consequences for these alternative models, and they have a real possibility of fundamentally destroying these other approaches. A good example in my mind is the PACT Act, which is so focused on putting process burdens on large, centrally moderated platforms that it ignores the implications for different content moderation approaches. It basically implicitly assumes all moderation decisions are being made by the service provider itself. It doesn't even contemplate that important decisions might be made by users, let alone a situation like ours, where over 99% of the content moderation decisions on Reddit are made by users.

So because of the way it's focused on these process penalties it imposes on providers, it creates a perverse incentive: users are better off reporting to Reddit rather than to their own communities. So even something that seems well-intentioned, like the PACT Act, would end up basically undermining and, in the long run, fundamentally destroying our layered moderation approach. The PACT Act would end up turning us into Facebook.

This hearing will prominently feature Republican lawmakers claiming the major platforms are biased against them. What's your reaction to those anti-conservative bias allegations?

It does frustrate me because I think there's a serious conversation to be had about, for example, algorithmic bias. What can we do better to make what appear to be facially neutral algorithms less biased? That is a reasonable discussion and a reasonable technical discussion. But I feel like it's a nuanced discussion that is lost in this debate right now.

I think there are some hard questions that we need to answer with regards to how best to architect these centrally moderated approaches, but I'm not certain that these debates are the most nuanced way in which to come to a clear direction with regards to how to deal with these sorts of issues.

After facing years of criticism over the proliferation of hate speech, Reddit recently expanded its rules against hateful content significantly. Hasn't that effort made your model more "centralized"? How does Section 230 play into that?

Section 230 allowed us to do exactly what we're doing right now with regards to hate. Our campaign against hateful content is all built around our partnership with our communities. We literally cowrote our hate policies in consultation with the communities. The tools that we built to help the community and help us hunt down hateful content were all facilitated through Section 230.

Are there any proposals you've seen that you would get behind?

We're always open to approaches that require providers to be thoughtful about issues such as transparency. We're also open to evaluating different ways of approaching 230 that are more mindful of community-based approaches to content moderation and don't end up undermining such alternatives to today's dominant social networks.

Personally I've seen bits and pieces of this in a variety of the different proposals. For example, the transparency pieces I've seen — there are aspects of those that I think are quite realistic and reasonable.

What is Reddit's position on the EARN IT Act?

I mean, child sexual abuse material is already illegal. A provider that mishandles CSAM [child sexual abuse material] is not protected under Section 230. Reddit takes CSAM very, very seriously. And frankly, I think most providers do as well.

I think that if we as a society want to take the issue of CSAM more seriously, there are a variety of other proposals that more realistically address the issue of CSAM [and] provide the enforcement resources. We've highlighted these different materials. EARN IT doesn't feel like a serious attempt to actually address CSAM. It feels like there are a lot of other issues built into EARN IT, other than addressing child safety, and it raises a lot of the issues we've been talking about with regards to Section 230.

Reddit has a reputation for hosting hate speech and bigoted rhetoric that Democrats have said they're concerned about. If it's not Section 230, do you think there are any legislative interventions needed to have the government step in on this?

That's a hard question. I feel that Reddit has struggled to balance fighting hate online against freedom of expression, just as much as our whole country has struggled with that balance. I feel like especially this year, we've tried to take aggressive steps in that area that move the ball forward with regards to how we want to see our communities evolve.

As to what Congress can do, I feel like that's a harder question. Congress always has to balance what it's capable of doing with regards to hate against the current interpretation of freedom of expression under the First Amendment, so the balance there really comes down to the relationship between Congress, the Supreme Court and the rule of law. It's a very, very difficult topic.

What's your response to those who say Reddit's decentralized content moderation model is what has enabled hate to spread in the past — that it put so much in the hands of community moderators who set their own guidelines?

We've learned a lot about how to empower our communities in a way to allow them to grow in a positive direction. But we've also learned a lot about how communities can be weaponized against each other and can be used to undermine the sort of belonging that is fundamental to our mission. And through that, we've — with the support of 230 behind us — been able to empower the right sorts of behaviors within these communities. Fundamentally, it's because 230 protects our ability to try these different approaches.

Realistically, what do you predict will happen to 230 over the next four years — under either administration?

I know some academics have been somewhat vocal about their pessimism about the fate of 230. I am more hopeful. Section 230 is unique in the entire world. What would be super unfortunate is if we end up throwing out 230 in an effort to punish the largest internet players for their perceived or real abuse of their dominance.

Unraveling 230 would basically further ensure that dominance, while undermining the ability of smaller companies like Reddit to challenge that dominance with alternative models of innovation.

You say repealing Section 230 would harm competition because of the cost, right? The major tech companies could afford the barrage of lawsuits it might unleash whereas smaller companies, maybe like Reddit, could not?

No question, part of it is the significant burden and cost placed on smaller competitors. A startup that's trying to innovate in this space won't survive a swath of lawsuits. So that's definitely part of it. But there's also this other part, which is the notion that when [lawmakers] focus on these centrally moderated models, they [unintentionally] create incentives that make other platforms start to resemble these centrally moderated models. And they end up cementing that as the approach, which I don't think is a consequence that is either intended or that, in the long term, we as a society want.

It's funny — I do feel that these discussions are far more nuanced once you're one-on-one with any of the lawmakers and policymakers. So my hope is that eventually this discussion becomes more nuanced than it currently is in the public sphere.
