Facebook's audacious experiment in corporate governance inched closer to reality Wednesday with the announcement of the first 20 members of its oversight board. But while much attention has been paid to how the board stands to rewrite Facebook's own rules, an equally important question is how it stands to rewrite the rules for every other tech platform, too.
The board's founding members include a former prime minister of Denmark, a Yemeni Nobel Peace Prize laureate, and a former federal judge nominated by President George W. Bush. If all goes according to plan, this Supreme Court-style body will be up and running by the fall, hearing cases and issuing decisions on what content should or shouldn't be removed from the world's largest social network.
It's a bold idea for Facebook. But the board isn't just for Facebook. In designing this new organization, Facebook's leaders deliberately structured it so that it could have a life beyond the company. To do that, they formed a separate legal trust with an initial $130 million investment from Facebook. They also empowered that trust both to accept funding from sources outside Facebook and to form companies of its own. That structure is designed to ensure Facebook CEO Mark Zuckerberg can't simply shut down the board if he doesn't like its decisions. But it also opens up the possibility that the trust might someday spin off additional oversight boards for, say, YouTube, Twitter, or any other platform that makes content moderation decisions.
"That was conscious on our part in building the trust," Brent Harris, Facebook's director of governance and strategic initiatives, told Protocol. "It has been built in a way that it can go beyond Facebook and go to more parts of the industry."
"This is a big, big mission," Helle Thorning-Schmidt, Denmark's first female prime minister and one of the board's four co-chairs, said on a call with reporters Wednesday. "We are basically building a new model for platform governance."
Over the last year, as Facebook sketched out the contours of the board in a series of global workshops, Harris said he and his team spoke with leaders from "almost every part of the industry."
"[They're] watching how this is being built, at times also sharing their own perspective and feedback about what they think we're doing right and what's potentially off and what makes them nervous," Harris said. He declined to name any particular tech companies that have been involved in these discussions, but said, "I don't think it's hard to imagine who has participated."
Twitter declined to comment for this story. YouTube spokesperson Alex Joseph said the company's policy team has "ongoing conversations with their counterparts at other major tech companies on a variety of topics, including Facebook's oversight board."
There are potential benefits for companies that might choose to join in, said Kate Klonick, an assistant professor of law at St. John's University who has been studying the creation of Facebook's oversight board. For one thing, Facebook has had to pour tremendous time and resources into creating the oversight board and the trust — resources that few other companies have or would choose to spend on their own. At the same time, Klonick said, the way those same companies make content moderation decisions is becoming increasingly untenable, opening them up to liability at a time when lawmakers in the U.S. and around the world are threatening to chip away at tech platforms' legal protections.
"I think it's unquestionably something other companies could benefit from, if they do it in the correct way," Klonick said. "You could start having a universally more transparent, more procedurally correct, more accountable system for online speech across platforms, instead of this ad hoc frankly bullshit we've had for the last 25 years online."
Of course, there are plenty of ways this type of cross-industry collaboration could go wrong, too. Few free speech advocates, Klonick included, want to see a world in which every company adheres to the decisions from one 40-member oversight board.
"If this becomes a mechanism to move more and more of the internet toward one single set of rules, that's a real loss," said Daphne Keller, platform regulation director of Stanford's Cyber Policy Center.
For Facebook, on the other hand, the upside of having more platforms adopt this structure is obvious, Klonick said. "If more people are doing this, it makes it more likely their experiment succeeds, and this is a long-term solution," she said, adding that it's in Facebook's best interest to make it easy for other companies to stand up similar entities. "If they can make it more plug and play for Twitter, that increases the likelihood that Twitter's going to do it."
Opening the door to other funders and participants could also help combat the idea that the board members are bought and paid for by Facebook. "Facebook wins by having the board be independent and legitimate and respected," Keller said. "If it winds up seeming like it's just doing Facebook's bidding, the whole effort would be a waste of money for them."
On a call with reporters, Catalina Botero Marino, a board co-chair and former special rapporteur for freedom of expression at the Inter-American Commission on Human Rights, said she and her colleagues will be bound to uphold a "duty of ingratitude" toward Facebook.
"I am convinced that the best way to maintain the architecture of the internet, and its immense democratizing potential and prevent the adoption of harmful regulations by states is for companies, in particular for the major platforms, to self regulate," Marino said.
That said, the board's director of administration, Thomas Hughes, said on the call that the board's "primary focus" for the time being is Facebook.
Governments around the world are already considering imposing similar oversight models on tech platforms. In Ireland, where Facebook, Google, Twitter and other tech giants have their European headquarters, Keller noted that regulators have proposed requiring video platforms to retain "independent decision-makers" who can hear user appeals on content moderation decisions.
Both Keller and Klonick worry that after seeing what Facebook has created, governments could begin requiring similar structures at other companies, before the model has even had a chance to prove itself. "One of my main concerns is that lawmakers who are desperate for a quick fix will say, 'Aha! The quick fix has arrived,' and make something like this mandatory before we see how well it works," Keller said.
For Facebook, creating an independent but still self-styled oversight body is undoubtedly a hedge against just that kind of mandate. The board does have the power to appoint new members and override individual content moderation decisions Facebook makes. But ultimately, its members are bound to uphold the content standards that Facebook created, just as the real U.S. Supreme Court is bound to uphold the Constitution. The board can also recommend changes to Facebook's content standards going forward, but Facebook retains broad discretion over whether to adopt those recommendations. That could make the oversight board structure an attractive option for other companies facing more heavy-handed regulation around the world.
"Part of what's going on here is Facebook's trying to get ahead and self-regulate to avoid top-down regulation," Klonick said. "They might do a better job."