Biz Carson

Moderation can’t wait: The challenges startups like Clubhouse face trying to build community

A former head of moderation at Quora says it's about more than inviting diverse voices to a network: It's about moderating them all to promote inclusion.


Culture will develop whether you want it to or not, says former Quora head of moderation Tatiana Estévez. That's why apps like Clubhouse need to start moderation and community-building early.

Photo: Courtesy of Tatiana Estévez

In just four months, Clubhouse has attracted users from Oprah Winfrey to the elite circles of venture capital. The buzzy, still-in-beta app is an audio-only social network where people can host rooms to talk or just drop in for a listen; the conversations range from product growth strategies to Black Lives Matter to Jesus and Socrates.

But like any social network, Clubhouse isn't immune to bad behavior, even given the caliber of its user base and its beta status. After an argument on Twitter between New York Times journalist Taylor Lorenz and venture capitalist Balaji Srinivasan boiled over into a Clubhouse session, the app faced criticism for not adequately addressing reports of harassment. That's not a great look for an in-beta startup valued at $100 million.

That perhaps explains why Clubhouse revealed on Friday its updated community guidelines along with new tools to block and report specific cases of harassment. The accompanying blog post, however, seems to raise more questions than it answers about how it plans to police its platform and protect users.

To give Clubhouse its due, the company has only two full-time employees and is incredibly young. But even just months into its existence, Clubhouse has already waited too long to think about how it will address harassment and moderation, said Tatiana Estévez, who previously led moderation and community for Quora.

"It is not something that you can push down the line because in a way it happens anyway," Estévez told Protocol. "Culture will develop whether you want it to or not. Tone will develop whether you want it to or not." She added: "I think moderation and community should be part of the product. It should be seen as an integral part of the product from the onset."

Protocol spoke with Estévez, who is currently working on "something new," after she wrote a widely shared Twitter thread on the challenges Clubhouse and other networks face when it comes to moderation and building an inclusive community — a problem that might only be amplified in a conversational setting where we know that it's easier for men to talk over women.

In an interview with Protocol, Estévez talked about the challenges of audio vs. written moderation, what founders should be considering when building a community, and whether it's possible to even have a safe space for conversation online.

The conversation took place before Clubhouse revealed its updated community guidelines, and Estévez declined to comment on its public statements.

This interview has been edited and condensed for clarity.

I was following the Clubhouse debate, and there were lots of concerns about the talks that were happening on the platform. And as you pointed out in your Twitter thread, audio is a hard medium to moderate. So why did the Clubhouse situation catch your attention?

I had no strong feelings about the actual debate that was happening, but it made me think a lot about the complexities of audio and how difficult it is to track what the actual issue is compared to reading a conversation. We all know what the dynamics are like in real-life conversations. When it comes to people you don't know on the internet, we know that the worst of us tends to get amplified. It really made me think about those aspects and what the possible solutions are.

So you had worked at Quora where you were focused more on written moderation. How is audio a different beast to moderate?

I haven't moderated audio, so this is speculation on my part, but you're not going to think as hard when you're having a conversation in audio. Even in a fast-paced online conversation, if you're writing something, you can pause, you can think, you can backtrack and edit. Whereas in conversation, you're saying it as you think it. And that's going to be difficult.

You also have another aspect, which doesn't happen as much when you're writing, and that's tone. You could say something perfectly nice that, with the right inflection, with the right snark level, is going to be totally horrible. A few people have asked: Could you capture this all in transcripts? But you can't capture that tone. Also, if the conversation is going quickly back and forth, it might not be obvious that what one person is saying is actually a reaction to what someone else said a few minutes ago. It also takes a lot longer to listen. It seems like a higher-cost kind of moderation with a lot of quite easy-to-miss stuff.

To your point on tone, though, I feel like that's something in writing that is harder to moderate because in my writing, you don't get to capture my tone. So if I am being sarcastic, it may not come across as sarcasm.

There's loads of nuance in tone. It's a difficult problem because we're used to often talking about the extremes when we talk about problems with online moderation — like the really bad sexist comments or racist comments, and we all agree, that's bad. We want those people off. And that should be a problem that we're really dealing with.

But there's the other issue of the way people interact with each other: people that aren't intending to be assholes, but are making the place pretty hostile. And it's not necessarily obvious.

I've tried to talk about it by framing it in language that is easy. By giving it labels, I find people tend to understand it better. So when you look at a conversation, sometimes it's really not apparent what the issue is. But if I explain "this is gaslighting" or "this is mansplaining" or "this is sealioning," it starts to become clearer that there is a problem there, and it is a hostile environment. And this often happens with men and women interacting, especially men replying to women in a way that really makes women feel put off by the internet. That is why places aren't as inclusive as they should be, even when there's an intent to bring diversity into these spaces.

How should companies that are just getting started, like Clubhouse, be thinking about building a culture that is inclusive from the beginning?

So that's the most important thing — that they are thinking about it from the get-go. It is not something that you can push down the line because in a way it happens anyway: Culture will develop whether you want it to or not; tone will develop whether you want it to or not. So, if you're a founder, you're going to want to think about: What kind of tone do I want this place to have? Do I want this place to be inclusive? Free speech is important, but is it at any cost? What kind of rules and policies do I need to make sure that happens?

It doesn't have to be set in stone. In fact, it shouldn't be set in stone. It should be: What kind of place do I want this to be? What kind of voices do I want? I could see scenarios where there's a very specific community for a very specific demographic, where inclusivity might not be a factor, but they still need to think about that and make that decision. For example, if it's an LGBTQ+ forum, they don't have to be inclusive to people who are anti-LGBTQ. Right? So you need to think about that from the onset, and then you need to think about who you're inviting in.

You also need to be really aware, as [people join]: Are the voices that you wanted to speak actually speaking? Or are the same people being highlighted? We've seen this on so many social sites: The default is that men tend to speak more than women, and they tend to speak louder, and it's particularly white men. We've seen that over and over again, and it's not an accident. It's not that men just happen to like talking in social apps; there's a reason why [others] are being pushed out.

Do you find audio is a harder platform for this than the written word? I'm wondering if the keyboard is kind of an equalizer, or if you have found that men still dominate a lot of the online written conversations.

I think there are a ton of problems with online conversation, whether it's written or audio. There is something about the written word that allows women to maybe address [others] with a bit more equality. With audio, you have a few extra elements, for instance, men interrupting women. That's one of the interesting dynamics.

I don't know if you've ever had this experience, but you're in a meeting with several men, and you've been trying to speak, but you keep being interrupted and men talk over you. It happens less these days, but still, I've experienced that quite a few times. And then I have to speak up, because otherwise I'm not going to get to say what I want to say. So I speak up, and then the men get upset and say, "Let me speak, let me finish."

You have this dynamic where women are being interrupted all the time, but men are the ones who are sensitive to it happening to them, because they're not used to it. And I did see this on Clubhouse.

I thought you made an interesting point that it's not just having diverse voices invited into the community, it's making sure that they're actually having a voice within the community. How much of that requires moderation versus proactive assistance on the part of the company?

I'm a community manager as well as a moderator. It has to be a combination. I think someone like me can help set tone a lot and set those values.

You also want the founders to be there setting those values, and it has to be something that's coming from the very top and has to be something that comes from the company. You're right, you need to look at product solutions as well, and that's not my area of expertise. But it's difficult. It's nuanced.

I think it's difficult because very few people look at these conversations the way I do. I know a lot of men who are very sympathetic to this, but they don't see it, they don't hear it, and they don't always hear it when they do it themselves, which is always difficult because they're often really well intentioned. Luckily, I know quite a few men who are willing to be educated on this, and are willing to hear this and accept what I'm saying. And if that happens, then I think you can start to get those product solutions, because when you really start to understand it, then it becomes clearer.

Do you think any company handles moderation well? These days we see Reddit in the news trying to address problems, while Facebook is being audited by civil rights groups who say it's dragging its feet.

I think what happens with a lot of companies is that they are in reactive mode. It's like when it becomes a problem, then they try and solve it. It's very difficult once your community tone is established, a culture is established, to then go and say, "Hey, we don't want this anymore." Because the power of that community is like, "Well, who are you to say we can't talk like this? We've decided that this is the tone." It's part of the identity of that product. So I think that's been the biggest problem with these companies in general.

The reactiveness of the companies?

Yes — they are dealing with it well after it's a problem. I think moderation and community should be part of the product. It should be seen as an integral part of the product from the onset. For every single feature, every single tool, every single addition, everyone should be asking the moderation team: How can this be abused? Is this a vector of abuse? For what it's worth, nearly everything we use is a vector of abuse in the wrong hands; that's just the world. And it's like: What is the likelihood of that happening? How can we fix it? That's at the product development stage.

You don't think about it once droves of women or Black people have left the product and had to go somewhere else. You have to think about [it] from the onset, because the damage has to a certain extent already been done if you're trying to deal with the result.

What else do you think people need to understand about moderating and creating these inclusive communities, which are more important every day?

I'd really like to stress that the online experience is not the same for everyone. And at the moment, we have a pervasive online experience where the default is white men. It's kind of forcing people to adopt the tone of white men, or just go somewhere else and not be part of that. And a lot of women, a lot of people of color, Black people, LGBTQ+ people, do not feel safe in this default white male space. And there's no reason for that. It's not OK anymore that people feel that these spaces are inherently hostile to them.

And you need to talk to these people — women, different marginalized groups, LGBTQ+, people of color, particularly Black people — and you need to ask them: What are the behaviors that are making you not want to be here? What would make this space safe for you? And I know there's some people who dislike that term "safe space." But I feel that's what we all really want. At the end of the day, we want a safe space to talk.

That's a great point. A lot of people hear safe space, and they think, "Oh you're cushioning people and keeping them from the truth" and that's not actually what that means at all.

If you knew me online, you would know that I'm very comfortable talking about demanding subjects and challenging subjects. And I think we can do that in a nice way. I think we can do that by being respectful to each other. I truly believe that it just requires a little bit of empathy for each other.
