Two weeks after Facebook suspended former President Donald Trump's account indefinitely, the company answered a chorus of calls and referred the case to its newly created Oversight Board for review. Now, the Board has 90 days to make a call as to whether Trump stays or goes permanently. The Board's decision — and more specifically, how and why it arrives at that decision — could have consequences not only for other global leaders on Facebook, but for the future of the Board itself.
Facebook created its Oversight Board for such a time as this — a time when it would face a controversial content moderation decision and might need a gut check. Or a fall guy. There could be no decision more controversial than the one Facebook made on Jan. 7, when it decided to muzzle one of the most powerful people in the world with weeks remaining in his presidency. It stands to reason, then, that Facebook would call in its newly anointed refs on the Oversight Board, both to earnestly review the call and to put a little distance between itself and the decision.
It also stands to reason that the Oversight Board would oblige. It was designed to be Facebook's Supreme Court analog, taking up cases that can set new, important precedents for Facebook and issuing decisions on those cases. As with the actual Supreme Court, what matters now is not just what decision the Board reaches regarding Trump's account, but how narrowly or broadly it rules. A broad decision that takes into account not just what Trump said on Facebook, but the offline consequences of his words, could mean tougher treatment of all global leaders with a record of exploiting Facebook to achieve violent ends. A narrow one could risk creating a global double standard, affirming fears that Facebook is more concerned with violence on its own home turf than in other countries.
"I actually think this will be the Oversight Board's Marbury v Madison moment," Stanford Law professor Nate Persily wrote on Twitter. "Meaning, even if they uphold the decision to suspend, the way they handle the case, decide on their jurisdiction, and consider the breadth of the issue presented will be important going forward."
But a broader question hangs over this entire experiment, and it is very much an experiment: Does the average Facebook user actually care what the Board has to say?
It's true that the Board's decisions will matter to Facebook, which means they will matter to its billions of users. The decisions it makes will be binding, according to bylaws agreed upon by Facebook, and will likely have domino effects on other content decisions down the line. But if there's a public relations aspect to all of this — that is, if Facebook is hoping to unload the burden of Trump's banishment and redirect some of the public backlash toward a third party — that effort seems doomed to fail.
Unlike with the actual Supreme Court, the average American may very well have no idea that the Oversight Board exists. Even if they do, how many of them will take the time to understand the tedious process Facebook underwent to ensure the Board is both bipartisan and independent? To the average Trump voter incensed about Facebook's decision to suspend Trump and so many other decisions before that, what is the Oversight Board, really, but an offshoot of the all-powerful tech giant they've believed to be shilling for Democrats all along?
That's what makes this moment so dicey for the Oversight Board. It's taking on one of its most consequential cases before it's even issued a single other decision. "On the one hand, it divests a huge amount of power from [Facebook] to give the Board authority over this. On the other hand, maybe the Board is too nascent to take on such an enormous question," Kate Klonick, an assistant professor at St. John's Law School, who has studied the Oversight Board extensively, tweeted Thursday. "The Board can establish its seriousness and jurisdiction/power over [Facebook]. That could be good for the Board, but it also means that it's very risky for establishing legitimacy [...] Not sending it would have also been a damning message — that the Board's authority was limited and that [Facebook] didn't really intend to give it any hard questions."
Now, just months after it came into existence, the Board already faces an existential question: whether to err on the side of public safety or public perception. "Whatever they say will piss off 50-ish percent of Americans. Purely as game theory, I think they're best off reinstating. That shows independence [and] reassures American conservatives, who broadly pose a bigger threat to the Facebook Oversight Board than American liberals," tweeted Daphne Keller, who directs the program on platform regulation at Stanford's Cyber Policy Center.
Keller later added that ruling based on those incentives "would be a dereliction of duty, in terms of what they are actually supposed to do."
But of course, those incentives do exist. The Board may be new, but the choice it's now facing is not. It's a choice between public safety and self-preservation — a choice that will determine the future of dictators and strongmen around the world and, as we've recently seen, have very real implications for the people they govern. It's a choice Facebook has made again and again throughout its history — a choice it's now asking the Board to make instead.