Exit interview: Facebook’s former counterterrorism chief talks Meta’s moves in Russia

“They, like everyone else, don’t really know what a company’s job is in this situation.”


Brian Fishman (center) worked for five years as the counterterrorism chief at Facebook.

Photo: Vincenzo Rando/Kontrolab/LightRocket via Getty Images

Halfway through my interview with Brian Fishman, Russian regulators announced they had fully blocked access to Facebook inside the country. Fishman wasn’t surprised. As Facebook’s former head of Counterterrorism Policy, he’d been watching his former employer and other tech giants try to “walk a tightrope” in Russia, attempting to keep lines of communication open for Russian people while also minimizing the propaganda and disinformation coming out of the Russian government.

“You're not going to be able to balance for long,” Fishman said.

Fishman knows more than almost anyone about how Facebook weighs the risks and rewards of silencing people — even very bad people — on the platform. In a company that’s more comfortable with wielding soft power through information labels and fact-checks, Fishman was the pointy end of the content moderation spear, the person responsible for maintaining the company’s list of “dangerous individuals and organizations” — the people and groups deemed too awful to be on Facebook or to even be praised by others on Facebook. That includes foreign terrorists like ISIS and its leaders, but also, in recent years, domestic extremists like the Proud Boys and violent conspiracy networks like QAnon.

Fishman announced in October that he was leaving the company after five years, smack in the midst of The Wall Street Journal’s reporting on whistleblower Frances Haugen’s disclosures — timing Fishman warned his Twitter followers not to read too much into.

The counterterrorism expert and current senior fellow at New America has kept pretty quiet about Meta’s moves ever since. But he spoke with Protocol about the decisions Meta’s making in Russia, how those decisions apply to other global conflicts and why he knew it was time to walk away from Facebook.

This interview has been lightly edited and condensed for clarity.

What do you make of the decisions Meta’s made related to Russia?

I think that Meta and other tech companies are wrestling with a tremendous geopolitical upheaval. What you've seen from the tech companies is a pattern that we've seen before, which is a relatively careful escalation of policies, in which one company will jump forward and then others will maybe jump a little bit further, and that has been a relatively gradual process.

The trick with Ukraine is that this really is a — not an unprecedented situation, but it is an extraordinary situation. Besides the fall of the Soviet Union, this is the biggest geopolitical event of my lifetime. It will have longer and broader impacts than 9/11. There is a fundamental reality that all sorts of actors are sort of feeling their way through this one really carefully, and the tech companies are no different.

One of the things that we've got to be careful of saying, therefore, is: Well, if [Meta] did something here in this Russian invasion of Ukraine, they must do it in other places. In terms of its global geopolitical impact, there just aren't a lot of direct comparisons.

A lot of folks wrongly said, well, human suffering here is different. That's not true. The human suffering in Syria was extraordinary. Some of the commentary has been pretty, frankly, racist. But it is true that the geopolitical difference is larger than in these other kinds of circumstances, and it's going to impact Silicon Valley companies more than some of the other conflicts.

Are there systems that have been developed in other conflict regions that you can see Facebook relying on in this moment?

Facebook is way better at dealing with crises today than they were when I first got there. There are teams that deal with these things. They've got a much deeper bench of people that have dealt with this kind of problem in previous jobs. They haven’t been resting on their laurels.

But they, like everyone else, don’t really know what a company's job is in this situation. Every Silicon Valley company, their first instinct is: We’ve got to keep the lights on wherever people can talk. I tend to be skeptical of that view, generally. But I think it's right here. Keeping the lights on, keeping the information flowing in Russia in particular right now, is really important to letting people speak as repression increases. They've been trying to walk that tightrope. How do we limit our ability to be abused by an increasingly overtly authoritarian actor while trying to empower everyday people, who are not responsible for the crimes of the government, to speak to each other, to organize, to get information about what's happening?

You're not going to be able to balance for long. The Russian state is clearly on an authoritarian bent. It has the capacity to block reasonably effectively. And so I think that we're going to see that happen. They're going to force people to rely on systems that they think they can control more effectively.

At the same time, we have seen Facebook not operate in China. We've seen Google pull out of China. There are countries in the world where people would benefit from being able to, in an uncensored manner, communicate, and those companies have opted not to operate there. So I wonder: Do you think they're headed toward a similar moment with Russia?

I hope not, because I do think that information flowing freely is invaluable. But I think that if the Russian government puts conditions on their behavior in Russia, they have to consider the China option. If the Russian government says, “You must carry certain information in order to operate here,” or “You must give us access to certain information in order to operate here,” you can go down the list.

Isn't Russia kind of doing that with all these data localization and hostage-taking laws? [Editor’s note: After this interview, Russia also passed a law prohibiting the publication of “fake news” about the military, prompting TikTok to suspend most of its operations there.]

Before the invasion, Russia was trying to set the legal table, so that they would have those leverage points. The case with the Navalny app was an example of that. That's the danger of those laws more generally, around the world. They give governments a tool.

The U.S. government, our constitutional system, is clunky as hell sometimes, but it is built around the idea of making it difficult for governments to crack down on people. Many democracies are not built with those kinds of protections strongly embedded in the institutional structure. And that's an important thing to remember, because governments go bad sometimes, because they're made of people. And people make mistakes. They're prone to the dark side. That's why these laws can be dangerous.

I do think there's a difference between Russia trying to pass those laws in a pre-Ukraine invasion situation and the sort of overtly authoritarian bent that we're seeing now. They're threatening protesters with five-year prison terms, forcible conscription and these kinds of things. We are effectively moving into a place where their behavior is extrajudicial.

Going back to what you said about companies having systems in place to manage conflict: What are some of those systems?

There are improved processes for centralizing information from across the company and making sure it gets up to leadership. Companies didn't know that they were going to need that stuff. I would argue they should have known sooner than they did, but they didn't. But now a lot of those processes are there. And they exist. And they're staffed by people that know what they're talking about and have backgrounds doing that kind of work in government and elsewhere.

And so, you've got a lot of people — not just at a senior level — that have dealt with geopolitical crises in various forms now in government, and elsewhere. That doesn't mean that every decision winds up being a good one. But it does mean that they've got folks who have been through the wringer with these kinds of situations in the past.

The flip side of that is: Nobody who wasn't of working age in 1989 to 1990 has been through something quite like this.

Do you think what Facebook went through in Myanmar is in any way instructive for this moment?

I think that when many people look at Facebook's performance in Myanmar, understandably, they focus on the failures. But there's also a series of actions that Facebook has taken in Myanmar that are more aggressive against the government than they've taken anywhere else in the world — and that are more aggressive, I think, than any other social media company has taken against the government in the world. And I think it's fair to ask if those kinds of actions can be taken in other extraordinary circumstances.

If you are going to do that, the bar needs to be really, really high. It’s untenable for companies to go around and smack the hands of governments all the time around the world. It needs to be a tool that they can pull out of their pocket in extraordinary circumstances. Advocates around the world have to understand that.

And what I worry about is companies being concerned about setting a precedent that they will then be asked to use all the time. What we need them to be able to do is set a bar that's really high, and all of us outside understand that that bar is really high, and that we're not necessarily going to have it used for the conflict that we think is particularly important.

Understanding that the bar should be really, really high — and I agree — do you think there was a missed opportunity along the way, post-2016, to do something more about Russian propaganda on the platform, given that that was a very targeted attempt to interfere in the election by a foreign government?

That was not the stuff that I was most directly involved in. I really want to be careful, coming out of a place like that. Especially when you're in a senior role, you're aware of a bunch of things, but you're not in every conversation if it's not the thing that you're working on.

So, I just saw that Russia did block Facebook.

It's not surprising. It's easy to criticize that iterative policy development that the companies do as incremental. But salami-slicing is a strategy in international relations.

One of the things we criticize the Russians for doing is they take a little bit, and they see what they can get away with. And they take a little bit more, then see what they can get away with. I don't know that that was a strategic choice by any of the companies. But I do think as this entire world gets more mature, we're going to start thinking about those sorts of things as strategic choices in companies' geopolitical stances.

Sometimes, intentionally, you want to go all in. Other times, maybe you want to see what you can do before you elicit a response. My instinct is that this was not intentional by any of the companies in this case. But I do think, over time, as we get used to companies operating as geopolitical actors, these kinds of decisions may get a little bit more structured and more intentional.

Changing topics, you left the company in November. I'm interested in what led to that decision.

I was at Facebook for five and a half years, the longest job I've ever had. I'm not quite sure how that happened. Those jobs are incredibly intense. Bottom line, I got to a point where I didn't feel the same fire internally for some of the fights that you need to have. When you start to feel that it's time to go, it may not show up in your work right away, but it will. And so it's time to go.

I'm not going to get into specific questions. But I think one of the things that gave me comfort in leaving is that when I got to Facebook, the bench of people with backgrounds kind of like mine [was small]. My old team is stronger, personnel-wise, than it’s ever been. There is a broader universe of smart people that have backgrounds thinking about national security issues. It was easier for me to walk away feeling like there was a universe of folks that could take on some of those things.

And I won't lie, there were some things that I disagreed with and that I didn't want to do. And then I was frustrated. But to be honest, I don't really want to have a public discussion about it.

You announced you were leaving right around the time Frances Haugen came out with her disclosures, and you tweeted something like, “Just a reminder that correlation does not equal causation.” I’m interested in what you thought of what she brought forward.

I don't know exactly what she brought forward, because a bunch of the leaked stuff is still not available publicly. I did not know Frances Haugen. I don't think I've ever met her. I think that the folks that have access to those documents need to be very careful. Some of them may indicate really careful work. But many of them are going to reflect random people doing analysis on some issue that is close to their heart, and the terminology that they use and the methods that they use may not be indicative of how the organization as a whole measures or defines anything.

Folks need to be real careful with that kind of data as they analyze this. Social media companies aren't the only ones trying to figure out what to do with social media. Activists, governments are all struggling, you know, in their own ways, with similar problems. But a lot of that means: Don't just take stuff immediately at face value. You have to get down to: How are terms defined? Where did the data come from? How was it actually analyzed? And if you can't do that, you ought to be really, really skeptical.

