
Moderation can’t wait: The challenges startups like Clubhouse face trying to build community

A former head of moderation at Quora says it's about more than inviting diverse voices to a network: It's about moderating them all to promote inclusion.

Tatiana Estévez

Culture will develop whether you want it to or not, says former Quora head of moderation Tatiana Estévez. That's why apps like Clubhouse need to start moderation and community-building early.

Photo: Courtesy of Tatiana Estévez

In just four months, Clubhouse has attracted users ranging from Oprah Winfrey to the elite circles of venture capital. The buzzy, still-in-beta app is an audio-only social network where people can host rooms to talk or just drop in for a listen — the conversations range from product growth strategies to Black Lives Matter to Jesus and Socrates.

But like any social network, the company isn't immune to bad behavior, even with the caliber of its user base and its beta status. After an argument on Twitter between New York Times journalist Taylor Lorenz and venture capitalist Balaji Srinivasan boiled over into a Clubhouse session, the app has faced criticism for not adequately addressing reports of harassment. Not great for an in-beta startup valued at $100 million.

That perhaps explains why Clubhouse revealed its updated community guidelines on Friday, along with new tools to block and report specific cases of harassment. The accompanying blog post, however, seems to raise more questions than it answers about how the company plans to police its platform and protect users.

To give Clubhouse its due, it has only two full-time employees and is incredibly young. But even a few months into its existence is too late for Clubhouse to start thinking about how it will address harassment and moderation, said Tatiana Estévez, who previously led moderation and community at Quora.

"It is not something that you can push down the line because in a way it happens anyway," Estévez told Protocol. "Culture will develop whether you want it to or not. Tone will develop whether you want it to or not." She added: "I think moderation and community should be part of the product. It should be seen as an integral part of the product from the onset."

Protocol spoke with Estévez, who is currently working on "something new," after she wrote a widely shared Twitter thread on the challenges Clubhouse and other networks face when it comes to moderation and building an inclusive community — a problem that might only be amplified in a conversational setting where we know that it's easier for men to talk over women.

In an interview with Protocol, Estévez talked about the challenges of audio vs. written moderation, what founders should be considering when building a community, and whether it's possible to even have a safe space for conversation online.

The conversation took place before Clubhouse revealed its updated community guidelines, and Estévez declined to comment on its public statements.

This interview has been edited and condensed for clarity.

I was following the Clubhouse debate, and there were lots of concerns about the talks that were happening on the platform. And as you pointed out in your Twitter thread, audio is a hard medium to moderate. So why did the Clubhouse situation catch your attention?

I had no strong feelings about the actual debate that was happening, but it made me think a lot about the complexities of audio and how difficult it is to track what the actual issue is compared to reading a conversation. We all know what the dynamics are like in real-life conversations. When it comes to people you don't know on the internet, we know that the worst of us tends to get amplified. It really made me think about those aspects and what the possible solutions might be.

So you worked at Quora, where you were focused more on written moderation. How is audio a different beast to moderate?

I haven't moderated audio, so this is speculation on my part, but you're not going to think as hard when you're having a conversation in audio. Even in a fast-paced online conversation, if you're writing something, you can pause, you can think, you can backtrack and edit. Whereas in conversation, you're saying it as you think it. And that's going to be difficult.

You also have another aspect, which doesn't come up as much when you're writing, and that's tone. You could say something perfectly nice that, with the right inflection, with the right level of snark, becomes totally horrible. A few people have asked, could you capture this all in transcripts? But you can't capture that tone. Also, if the conversation is going quickly back and forth, it might not be obvious that what one person is saying is actually a reaction to what someone else said a few minutes ago. It also takes a lot longer to listen. It seems like a higher-cost kind of moderation with a lot of easy-to-miss stuff.

To your point on tone, though, I feel like that's something in writing that is harder to moderate because in my writing, you don't get to capture my tone. So if I am being sarcastic, it may not come across as sarcasm.

There's loads of nuance in tone. It's a difficult problem because when we talk about problems with online moderation, we often talk about the extremes — like the really bad sexist comments or racist comments, and we all agree that's bad. We want those people off. And that should be a problem that we're really dealing with.

But there's the other issue of the way people interact with each other: people that aren't intending to be assholes, but are making the place pretty hostile. And it's not necessarily obvious.

I've tried to talk about it by framing it in language that's easy to grasp. By giving it labels, I find people tend to understand it better. So when you look at a conversation, sometimes it's really not apparent what the issue is. But if I explain "this is gaslighting" or "this is mansplaining" or "this is sealioning," it starts to become clearer that there is a problem there, and it is a hostile environment. And this often happens with men and women interacting, especially men replying to women in a way that really puts women off the internet. That is why places aren't as inclusive as they should be, even with the intent of wanting diversity in those spaces.

How should companies that are just getting started, like Clubhouse, be thinking about building a culture that is inclusive from the beginning?

So that's the most important thing — that they are thinking about it from the get-go. It is not something that you can push down the line, because in a way it happens anyway: Culture will develop whether you want it to or not; tone will develop whether you want it to or not. So, if you're a founder, you're going to want to think about: What kind of tone do I want this place to have? Do I want this place to be inclusive? Free speech is important, but is it important at any cost? What kind of rules and policies do I need to make sure that happens?

It doesn't have to be set in stone. In fact, it shouldn't be set in stone. It should be like, what kind of place do I want this to be? What kind of voices do I want? I could see scenarios where there's a very specific community for a very specific demographic, where inclusivity might not be a factor, but they still need to think about that and make that decision. For example, if it's an LGBTQ+ forum, it doesn't have to be inclusive of people who are anti-LGBTQ+. Right? So you need to think about that from the onset, and then you need to think about who you're inviting in.

You also need to be really aware, as [people join], whether the voices you wanted to speak are actually speaking, or whether it's the same people being highlighted. We've seen on so many social sites that the default is that men tend to speak more than women, and they tend to speak louder, and it's particularly white men. We've seen that over and over again, and it's not an accident. It's not that men just happen to like talking in social apps; there's a reason why [others] are being pushed out.

Do you find audio is a harder platform for this than the written word? I'm wondering if the keyboard is kind of an equalizer, or if you have found that men still dominate a lot of the online written conversations.

I think there are a ton of problems with online conversation, whether it's written or audio. There is something about the written word that allows women to maybe address [others] with a bit more equality. With audio, you have a few extra elements, for instance, men interrupting women. That's one of the interesting dynamics.

I don't know if you've ever had this experience, but you're in a meeting, there are several men, and you've been trying to speak, but you keep being interrupted and men talk over you. It happens less these days, but still, I've experienced that quite a few times. And then I have to speak up because otherwise I'm not going to get to say what I want to say. So I speak up, and then the men get upset and say, "let me speak, let me finish."

You have this dynamic where women are being interrupted all the time, but men are the ones who are sensitive to it happening to them because they're not used to it. And I did see this on Clubhouse.

I thought you made an interesting point that it's not just about inviting diverse voices into the community; it's making sure they actually have a voice within it. How much of that requires moderation versus proactive assistance on the part of the company?

I'm a community manager as well as a moderator. It has to be a combination. I think someone like me can do a lot to help set the tone and set those values.

You also want the founders to be there setting those values; it has to come from the very top and from the company itself. You're right, you need to look at product solutions as well, and that's not my area of expertise. But it's difficult. It's nuanced.

I think it's difficult because very few people look at these conversations the way I do. I know a lot of men who are very sympathetic to this, but they don't see it, they don't hear it, and they don't always hear it when they do it themselves, which is difficult because they're often really well-intentioned. Luckily, I know quite a few men who are willing to be educated on this, and are willing to hear this and accept what I'm saying. And if that happens, then I think you can start to get those product solutions, because when you really start to understand it, it becomes clearer.

Do you think any company handles moderation well? These days we see Reddit in the news trying to address problems, while Facebook is being audited by civil rights groups who say it's dragging its feet.

I think what happens with a lot of companies is that they're in reactive mode. It's like, once it becomes a problem, then they try to solve it. It's very difficult, once your community's tone is established and a culture is established, to then go and say, "Hey, we don't want this anymore." Because the pushback from that community is, "Well, who are you to say we can't talk like this? We've decided that this is the tone." It's part of the identity of that product. So I think that's been the biggest problem with these companies in general.

The reactiveness of the companies?

Yes — they are dealing with it well after it's a problem. I think moderation and community should be part of the product. It should be seen as an integral part of the product from the onset. For every single feature, every single tool, every single addition, everyone should be asking the moderation team: How can this be abused? Is this a vector of abuse? For what it's worth, nearly everything we use is a vector of abuse in the wrong hands; that's just the world. And then: What is the likelihood of that happening? How can we fix it? That's at the product development stage.

You can't wait to think about it until you've had droves of women or Black people leaving the product, having to go somewhere else. You have to think about [it] from the onset, because the damage has been done to a certain extent if you're trying to deal with the result.

What else do you think people need to understand about moderating and creating these inclusive communities, which are more important every day?

I'd really like to stress that the online experience is not the same for everyone. And at the moment, we have an online experience where the default is white men. It's kind of forcing people to either adopt the tone of white men, or just go somewhere else and not be part of it. And a lot of women, a lot of people of color, Black people, LGBTQ+ people, do not feel safe in this default white male space. And there's no reason for that. It's not OK anymore that people feel these spaces are inherently hostile to them.

And you need to talk to these people — women, different marginalized groups, LGBTQ+ people, people of color, particularly Black people — and you need to ask them: What are the behaviors that are making you not want to be here? What would make this space safe for you? And I know there are some people who dislike the term "safe space." But I feel that's what we all really want. At the end of the day, we want a safe space to talk.

That's a great point. A lot of people hear "safe space" and think, "Oh, you're cushioning people and keeping them from the truth," and that's not actually what it means at all.

If you knew me online, you would know that I'm very comfortable talking about demanding subjects and challenging subjects. And I think we can do that in a nice way. I think we can do that by being respectful to each other. I truly believe that it just requires a little bit of empathy for each other.
