Stop me if this sounds familiar: Mark Zuckerberg holds an all-hands meeting, and gets questioned on a corporate policy that many of his employees don't like. Zuckerberg reminds the staff that Facebook believes in free speech, and is not interested in being "an arbiter of truth." Eventually, he reminds his team: "This is not a democracy."
That story comes from last fall, when the topic of discussion was Facebook's policy of allowing political ads and refusing to fact-check them. Now, months later, Facebook's back in a similar situation.
Zuckerberg took a more conciliatory tone on an all-hands call on Tuesday. The meeting was higher-stakes than usual, given the events of recent days. After Facebook declined to take action on President Trump's post saying "when the looting starts, the shooting starts," hundreds of employees protested the company's decision by simply declining to work. "People are in a lot of pain right now," one employee told me.
At least one engineer already publicly quit over the decision. Others have said they're turning down jobs they'd been pursuing with the company. And employees from around the company have criticized Facebook, and Zuckerberg himself. (That they're doing it on Twitter, you have to figure, probably makes it feel even worse.)
On Tuesday, Zuckerberg spoke on a video chat to about 25,000 Facebook employees, Vox reported. You can picture it: Zuckerberg, slightly too close to the camera and slightly overlit as he always is on these streams, in front of that light-wood wall.
First he reiterated his explanation for not taking down Trump's post. He said he was troubled by the post and agonized over the decision, but ultimately decided that what Trump posted had "no history of being read as a dog whistle for vigilante supporters to take justice into their own hands." Since it didn't clearly and imminently incite violence, he said, the post stayed online.
The explanation didn't seem to make anyone feel better — just as Zuckerberg's call with Trump and another with civil-rights leaders didn't seem to help. Zuckerberg wants his team to understand that he's just following the Facebook rules, but his team is shouting back that he needs to change those rules.
Like Zuckerberg has said, it's not a democracy. With just under 58% of the company's voting shares, the CEO is impossible to overrule. (And amid all this turmoil, Facebook shares are up slightly, so it's not like investors want his head.) Facebook's vaunted Oversight Board hasn't started working yet, and when it does it won't be able to get involved in issues like this. As Zuckerberg goes, so goes Facebook.
There does appear to be something of a moderation inner circle, though. Answering a question about who helped make the decision on Trump's post, Zuckerberg said that he, Sheryl Sandberg, controversial policy VP Joel Kaplan and head of diversity Maxine Williams (who was also the only Black person Zuckerberg said he consulted) were all on his list, along with a couple of others whom he didn't name. Notably missing? Guy Rosen, Facebook's head of integrity — the same job title Yoel Roth has at Twitter. Roth became the target of harassment after Twitter took its action on Trump's posts last week.
Still, there may be hope for employees seeking change. Bloomberg reported, citing anonymous sources from inside Facebook, that the company is already planning two new initiatives: a central hub where users can find election-related information, not unlike the one it built for COVID-19; and new initiatives for promoting racial justice.
Perhaps more interesting for employees railing against the decision about Trump's post, Facebook's also thinking about new ways to police content. Zuckerberg has always held that there are only two ways to handle moderation: leave content up, or take it down. But on Tuesday he said he was interested in exploring non-binary options, like a way to flag a violating post without removing it entirely. That's what Twitter has done a number of times now, to Trump and others, and many Facebook employees have said they liked that solution. He also said he's looking seriously at changing the overall moderation policies.
Will anything actually change? That's hard to know. Facebook has proven unusually good at weathering storms, even seemingly disastrous ones — just remember Cambridge Analytica. And on Tuesday, The New York Times pointed out, Zuckerberg echoed the things he's said through all those other crises — saying that "the net impact of the different things we're doing in the world is positive. I really believe it is."
It's not hard to believe Zuckerberg really means that. The fundamental goodness of connecting everyone has been a core belief of his — and, by extension, of Facebook's — since the company was founded. The good always seemed to outweigh the bad; the bad could always be seen as a small percentage of the content, created by an even smaller percentage of users. But this time, critics say, the bad is coming from the President, and those small percentages now register differently.
Unfortunately for Zuckerberg, there's only one person who can do something about it. Because Facebook is many things, but it's definitely not a democracy.
Correction: An earlier version of this article misstated the surname of Facebook's head of diversity. It is Williams, not Waters. Updated June 3, 2020.