Moderation can’t wait: The challenges startups like Clubhouse face trying to build community

A former head of moderation at Quora says it's about more than inviting diverse voices to a network: It's about moderating them all to promote inclusion.

Tatiana Estévez

Culture will develop whether you want it to or not, says former Quora head of moderation Tatiana Estévez. That's why apps like Clubhouse need to start moderation and community-building early.

Photo: Courtesy of Tatiana Estévez

In just four months, Clubhouse has attracted users from Oprah Winfrey to the elite circles of venture capital. The buzzy, still-in-beta app is an audio-only social network where people can host rooms to talk or just drop in for a listen — the conversations range from product growth strategies to Black Lives Matter to Jesus and Socrates.

But like any social network, the company isn't immune to bad behavior, even with the caliber of its user base and its still-in-beta status. After an argument on Twitter between New York Times journalist Taylor Lorenz and venture capitalist Balaji Srinivasan boiled over into a Clubhouse session, the app has faced criticism for not adequately addressing reports of harassment. Not great for an in-beta startup valued at $100 million.

That perhaps explains why Clubhouse revealed on Friday its updated community guidelines along with new tools to block and report specific cases of harassment. The accompanying blog post, however, seems to raise more questions than it answers about how it plans to police its platform and protect users.

To give Clubhouse its due, it has only two full-time employees and is incredibly young. But even just months into its existence, it is already late for Clubhouse to be thinking about how it will address harassment and moderation, said Tatiana Estévez, who previously led moderation and community for Quora.

"It is not something that you can push down the line because in a way it happens anyway," Estévez told Protocol. "Culture will develop whether you want it to or not. Tone will develop whether you want it to or not." She added: "I think moderation and community should be part of the product. It should be seen as an integral part of the product from the onset."

Protocol spoke with Estévez, who is currently working on "something new," after she wrote a widely shared Twitter thread on the challenges Clubhouse and other networks face when it comes to moderation and building an inclusive community — a problem that might only be amplified in a conversational setting where we know that it's easier for men to talk over women.

In an interview with Protocol, Estévez talked about the challenges of audio vs. written moderation, what founders should be considering when building a community, and whether it's possible to even have a safe space for conversation online.

The conversation took place before Clubhouse revealed its updated community guidelines, and Estévez declined to comment on its public statements.

This interview has been edited and condensed for clarity.

I was following the Clubhouse debate, and there were lots of concerns about the talks that were happening on the platform. And as you pointed out in your Twitter thread, audio is a hard medium to moderate. So why did the Clubhouse situation catch your attention?

I had no strong feelings about the actual debate that was happening, but it made me think a lot about the complexities of audio and how difficult it is to track what the actual issue is compared to reading a conversation. We all know what the dynamics are like in real-life conversations. When it comes to people you don't know on the internet, we know that the worst of us tends to get amplified. It really made me think about those aspects and what the possible solutions are.

So you worked at Quora, where you were focused more on written moderation. How is audio a different beast to moderate?

So I haven't moderated audio, so this is speculation on my part, but you're not going to think as hard when you're having a conversation in audio. Even in a fast-paced online conversation, if you're writing something, you can pause, you can think, you're going to backtrack, edit. Whereas in conversation, you're saying it as you think. And that's going to be difficult.

You also have another aspect, which doesn't happen as much when you're writing, and that's tone. You could say something perfectly nice that, with the right inflection, with the right snark level, is going to be totally horrible. A few people have asked: Could you capture all this in transcripts? But you can't capture that tone. Also, if you're going quickly back and forth, it might not be obvious from what one person is saying that it's actually a reaction to what someone else said a few minutes ago. It also takes a lot longer to listen. It seems like a higher-cost kind of moderation with a lot of quite easy-to-miss stuff.

To your point on tone, though, I feel like that's something in writing that is harder to moderate because in my writing, you don't get to capture my tone. So if I am being sarcastic, it may not come across as sarcasm.

There's loads of nuance in tone. It's a difficult problem because we often talk about the extremes when we talk about problems with online moderation — like the really bad sexist comments or racist comments, and we all agree those are bad. We want those people off. And that should be a problem that we're really dealing with.

But there's the other issue of the way people interact with each other: people who aren't intending to be assholes but are making the place pretty hostile. And it's not necessarily obvious.

I've tried to talk about it by framing it in language that is easy. By giving it labels, I find people tend to understand it better. So when you look at the conversation, sometimes it's really not apparent what the issue is. But if I explain "this is gaslighting" or "this is mansplaining" or "this is sealioning," it starts to become clearer that there is a problem there, and it is a hostile environment. And this often happens with men and women interacting, especially men replying to women in a way that really makes women feel super put off by the internet. That is why places aren't as inclusive as they should be, even with the intent of wanting diversity in these spaces.

How should companies that are just getting started, like Clubhouse, be thinking about building a culture that is inclusive from the beginning?

So that's the most important thing — that they are thinking about it from the get-go. It is not something that you can push down the line, because in a way it happens anyway: Culture will develop whether you want it to or not; tone will develop whether you want it to or not. So, if you're a founder, you're going to want to think about: What kind of tone do I want this place to have? Do I want this place to be inclusive? Free speech is important, but is it at any cost? What kind of rules and policies do I need to make sure that happens?

It doesn't have to be set in stone. In fact, it shouldn't be set in stone. It should be like, what kind of place do I want this to be? What kind of voices do I want? I could see scenarios where there's a very specific community for a very specific demographic, where inclusivity might not be a factor, but they still need to think about that and make that decision. For example, if it's an LGBTQ+ forum, they don't have to be inclusive of people who are anti-LGBTQ+. Right? So you need to think about that from the onset, and then you need to think about who you're inviting in.

You also need to be really aware, as [people join], whether the voices that you wanted to speak are speaking. Or is it the same people being highlighted? We've seen this on so many social sites: The default is that men tend to speak more than women, and they tend to speak louder, and it's particularly white men. We've seen that over and over again, and it's not an accident. It's not that men just happen to like talking in social apps; there's a reason why [others] are being pushed out.

Do you find audio is a harder platform for this than the written word? I'm wondering if the keyboard is kind of an equalizer, or if you have found that men still dominate a lot of the online written conversations.

I think there's a ton of problems with online conversation, whether it's written or audio. There is something about the written word that allows women to maybe address [others] with a bit more equality. With audio, you have a few extra elements — for instance, men interrupting women. That's one of the interesting dynamics.

I don't know if you've ever had this experience, but you're in a meeting with several men, and you've been trying to speak, but you keep being interrupted and men talk over you. It happens less these days, but still, I've experienced that quite a few times. And then I have to speak up, because otherwise I'm not going to get to say what I want to say. So I speak up, and then the men get upset and say, "Let me speak, let me finish."

You have this dynamic where women are being interrupted all the time, but men are the ones who are sensitive to it when it happens to them, because they're not used to it. And so I did see this on Clubhouse.

I thought you made an interesting point that it's not just having diverse voices invited into the community, it's making sure that they're actually having a voice within the community. How much of that requires moderation versus proactive assistance on the part of the company?

I'm a community manager as well as a moderator. It has to be a combination. I think someone like me can do a lot to help set the tone and set those values.

You also want the founders to be there setting those values; it has to come from the very top, and it has to come from the company. You're right, you need to look at product solutions as well, and that's not my area of expertise. But it's difficult. It's nuanced.

I think it's difficult because very few people look at these things, these conversations, the way I do. I know a lot of men who are very sympathetic to this, but they don't see it, they don't hear it, and they don't always hear it when they do it, which is always difficult because they're often really well-intentioned. Luckily, I know quite a few men who are willing to be educated on this, and are willing to hear this and accept what I'm saying. And if that happens, then I think you can start to get those product solutions, because when you really start to understand it, it becomes clearer.

Do you think any company handles moderation well? These days we see Reddit in the news trying to address problems, while Facebook is being audited by civil rights groups who say it's dragging its feet.

I think what happens with a lot of companies is that they are in reactive mode: When it becomes a problem, then they try to solve it. It's very difficult, once your community's tone is established and a culture is established, to then go and say, "Hey, we don't want this anymore." Because the power of that community is like, "Well, who are you to say we can't talk like this? We've decided that this is the tone." It's part of the identity of that product. So I think that's been the biggest problem with these companies in general.

The reactiveness of the companies?

Yes — they are dealing with it well after it's a problem. I think moderation and community should be part of the product. It should be seen as an integral part of the product from the onset. For every single feature, every single tool, every single addition, everyone should be asking the moderation team: How can this be abused? Is this a vector of abuse? For what it's worth, nearly everything we use is a vector of abuse in the wrong hands; that's just the world. And then it's: What is the likelihood of that happening? How can we fix it? That's at the product development stage.

You can't start thinking about it once you've had droves of women or Black people leaving the product, having to go somewhere else. You have to think about [it] from the onset, because the damage has been done to a certain extent if you're trying to deal with the result.

What else do you think people need to understand about moderating and creating these inclusive communities, which are more important every day?

I'd really like to stress that the online experience is not the same for everyone. And at the moment, we have an online experience where the default is heavily white men. It's kind of forcing people to adopt the tone of white men, or just go somewhere else and not be part of that. And a lot of women, a lot of people of color, Black people, and LGBTQ+ people do not feel safe in this default white male space. And there's no reason for that. It's not OK anymore that people feel that these spaces are inherently hostile to them.

And you need to talk to these people — women, different marginalized groups, LGBTQ+, people of color, particularly Black people — and you need to ask them: What are the behaviors that are making you not want to be here? What would make this space safe for you? And I know there's some people who dislike that term "safe space." But I feel that's what we all really want. At the end of the day, we want a safe space to talk.

That's a great point. A lot of people hear safe space, and they think, "Oh you're cushioning people and keeping them from the truth" and that's not actually what that means at all.

If you knew me online, you would know that I'm very comfortable talking about demanding and challenging subjects. And I think we can do that in a nice way. I think we can do that by being respectful to each other. I truly believe that it just requires a little bit of empathy for each other.
