Halfway through my interview with Brian Fishman, Russian regulators announced they had fully blocked access to Facebook inside the country. Fishman wasn’t surprised. As Facebook’s former head of Counterterrorism Policy, he’d been watching his former employer and other tech giants try to “walk a tightrope” in Russia, attempting to keep lines of communication open for Russian people while also minimizing the propaganda and disinformation coming out of the Russian government.
“You're not going to be able to balance for long,” Fishman said.
Fishman knows more than almost anyone about how Facebook weighs the risks and rewards of silencing people — even very bad people — on the platform. In a company that’s more comfortable with wielding soft power through information labels and fact-checks, Fishman was the pointy end of the content moderation spear, the person responsible for maintaining the company’s list of “dangerous individuals and organizations” — the people and groups deemed too awful to be on Facebook or to even be praised by others on Facebook. That includes foreign terrorists like ISIS and its leaders, but also, in recent years, domestic extremists like the Proud Boys and violent conspiracy networks like QAnon.
Fishman said in October he was leaving the company after five years, smack in the midst of The Wall Street Journal’s reporting on whistleblower Frances Haugen’s disclosures — timing Fishman warned his Twitter followers not to read too much into.
The counterterrorism expert and current senior fellow at New America has kept pretty quiet about Meta’s moves ever since. But he spoke with Protocol about the decisions Meta’s making in Russia, how those decisions apply to other global conflicts and why he knew it was time to walk away from Facebook.
This interview has been lightly edited and condensed for clarity.
What do you make of the decisions Meta’s made related to Russia?
I think that Meta and other tech companies are wrestling with a tremendous geopolitical upheaval. What you've seen from the tech companies is a pattern that we've seen before, which is a relatively careful escalation of policies, in which one company will jump forward and then others will maybe jump a little bit further, and that has been a relatively gradual process.
The trick with Ukraine is that this really is a — not an unprecedented situation, but it is an extraordinary situation. Besides the fall of the Soviet Union, this is the biggest geopolitical event of my lifetime. It will have longer and broader impacts than 9/11. There is a fundamental reality that all sorts of actors are sort of feeling their way through this one really carefully, and the tech companies are no different.
One of the things we've therefore got to be careful about saying is: Well, if [Meta] did something here in this Russian invasion of Ukraine, they must do it in other places. In terms of its global geopolitical impact, there just aren't a lot of direct comparisons.
A lot of folks wrongly said, well, human suffering here is different. That's not true. The human suffering in Syria was extraordinary. Some of the commentary has been pretty, frankly, racist. But it is true that the geopolitical stakes here are larger than in these other kinds of circumstances, and it's going to impact Silicon Valley companies more than some of the other conflicts have.
Are there systems that have been developed in other conflict regions that you can see Facebook relying on in this moment?
Facebook is way better at dealing with crises today than they were when I first got there. There are teams that deal with these things. They've got a much deeper bench of people that have dealt with this kind of problem in previous jobs. They haven’t been resting on their laurels.
But they, like everyone else, don’t really know what a company's job is in this situation. Every Silicon Valley company, their first instinct is: We’ve got to keep the lights on wherever people can talk. I tend to be skeptical of that view, generally. But I think it's right here. Keeping the lights on, keeping the information flowing in Russia in particular right now, is really important to letting people speak as repression increases. They've been trying to walk that tightrope. How do we limit our ability to be abused by an increasingly overtly authoritarian actor while trying to empower everyday people, who are not responsible for the crimes of the government, to speak to each other, to organize, to get information about what's happening?
You're not going to be able to balance for long. The Russian state is clearly on an authoritarian bent. It has the capacity to block reasonably effectively. And so I think that we're going to see that happen. They're going to force people to rely on systems that they think they can control more effectively.
At the same time, we have seen Facebook not operate in China. We've seen Google pull out of China. There are countries in the world where people would benefit from being able to, in an uncensored manner, communicate, and those companies have opted not to operate there. So I wonder: Do you think they're headed toward a similar moment with Russia?
I hope not, because I do think that information flowing freely is invaluable. But I think that if the Russian government puts conditions on their behavior in Russia, they have to consider the China option. If the Russian government says, “You must carry certain information in order to operate here,” or “You must give us access to certain information in order to operate here,” you can go down the list.
Isn't Russia kind of doing that with all these data localization and hostage-taking laws? [Editor’s note: After this interview, Russia also passed a law prohibiting the publication of “fake news” about the military, prompting TikTok to suspend most of its operations there.]
Before the invasion, Russia was trying to set the legal table, so that they would have those leverage points. The case with the Navalny app was an example of that. That's the danger of those laws more generally, around the world. They give governments a tool.
The U.S. government, our constitutional system, is clunky as hell sometimes, but it is built around the idea of making it difficult for governments to crack down on people. Many democracies are not built with those kinds of protections strongly embedded in the institutional structure. And that's an important thing to remember, because governments go bad sometimes, because they're made of people. And people make mistakes. They're prone to the dark side. That's why these laws can be dangerous.
I do think there's a difference between Russia trying to pass those laws in a pre-Ukraine invasion situation and the sort of overtly authoritarian bent that we're seeing now. They're threatening protesters with five-year sentences, forcible conscription, these kinds of things. We are effectively moving into a place where their behavior is extrajudicial.
Going back to what you said about companies having systems in place to manage conflict: What are some of those systems?
There are improved processes for centralizing information from across the company and making sure it gets up to leadership. Companies didn't know that they were going to need that stuff. I would argue they should have known sooner than they did, but they didn't. But now a lot of those processes exist, and they're staffed by people that know what they're talking about and have backgrounds doing that kind of work in government and elsewhere.
And so, you've got a lot of people — not just at a senior level — that have dealt with geopolitical crises in various forms now in government, and elsewhere. That doesn't mean that every decision winds up being a good one. But it does mean that they've got folks who have been through the wringer with these kinds of situations in the past.
The flip side of that is: Nobody who wasn't working age in 1989 to 1990 has been through something quite like this.
Do you think what Facebook went through in Myanmar is in any way instructive for this moment?
I think that when many people look at Facebook's performance in Myanmar, understandably, they focus on the failures. But there's also a series of actions that Facebook has taken in Myanmar that are more aggressive against the government than they've taken anywhere else in the world — and that are more aggressive, I think, than any other social media company has taken against a government. And I think it's fair to ask if those kinds of actions can be taken in other extraordinary circumstances.
If you are going to do that, the bar needs to be really, really high. It’s untenable for companies to go around and smack the hands of governments all the time around the world. It needs to be a tool that they can pull out of their pocket in extraordinary circumstances. Advocates around the world have to understand that.
And what I worry about is companies being concerned about setting a precedent that they will then be asked to use all the time. What we need them to be able to do is set a bar that's really high, and all of us outside understand that that bar is really high, and that we're not necessarily going to have it used for the conflict that we think is particularly important.
Understanding that the bar should be really, really high — and I agree — do you think there was a missed opportunity along the way, post-2016, to do something more about Russian propaganda on the platform, given that that was a very targeted attempt to interfere in the election by a foreign government?
That was not the stuff that I was most directly involved in. I really want to be careful coming out of a place like that. Especially when you're in a senior role, you're aware of a bunch of things, but you're not in every conversation if it's not the thing that you're working on.
So, I just saw that Russia did block Facebook.
It's not surprising. It's easy to criticize that iterative policy development that the companies do as incremental. But salami-slicing is a strategy in international relations.
One of the things we criticize the Russians for doing is they take a little bit, and they see what they can get away with. And they take a little bit more, then see what they can get away with. I don't know that that was a strategic choice by any of the companies. But I do think, as this entire world gets more mature, we're going to start thinking about those sorts of things as strategic choices in companies' geopolitical stances.
Sometimes, intentionally, you want to go all in. Other times, maybe you want to see what you can do before you elicit a response. My instinct is that this was not intentional by any of the companies in this case. But I do think, over time, as we get used to companies operating as geopolitical actors, these kinds of decisions may get a little bit more structured and more intentional.
Changing topics, you left the company in November. I'm interested in what led to that decision.
I was at Facebook for five and a half years, the longest job I've ever had. I'm not quite sure how that happened. Those jobs are incredibly intense. Bottom line, I got to a point where I didn't feel the same fire internally for some of the fights that you need to have. When you start to feel that it's time to go, it may not show up in your work right away, but it will. And so it's time to go.
I'm not going to get into specific questions. But I think one of the things that gave me comfort in leaving is that when I got to Facebook, the bench of people with backgrounds kind of like mine [was small]. My old team is stronger, personnel-wise, than it’s ever been. There is a broader universe of smart people with backgrounds in thinking about national security issues. It was easier for me to walk away feeling like there was a universe of folks that could take on some of those things.
And I won't lie, there were some things that I disagreed with and that I didn't want to do. And then I was frustrated. But to be honest, I don't really want to have a public discussion about it.
You announced you were leaving right around the time Frances Haugen came out with her disclosures, and you tweeted something like, “Just a reminder that correlation does not equal causation.” I’m interested in what you thought of what she brought forward.
I don't know exactly what she brought forward, because a bunch of the leaked stuff is still not available publicly. I did not know Frances Haugen. I don't think I've ever met her. I think that the folks that have access to those documents need to be very careful. Some of them may indicate really careful work. But many of them are going to reflect random people doing analysis on some issue that is close to their heart, and the terminology that they use and the methods that they use may not be indicative of how the organization as a whole measures or defines anything.
Folks need to be real careful with that kind of data as they analyze this. Social media companies aren't the only ones trying to figure out what to do with social media. Activists, governments are all struggling, you know, in their own ways, with similar problems. But a lot of that means: Don't just take stuff immediately at face value. You have to get down to: How are terms defined? Where did the data come from? How was it actually analyzed? And if you can't do that, you ought to be really, really skeptical.