Following a global search more than a year in the making, Facebook on Wednesday unveiled the first 20 members of its long-awaited oversight board, which will act as a sort of Supreme Court for Facebook's content decisions. The cohort was meticulously selected to represent a cross-section of world cultures, in an attempt to answer Facebook's critics on all sides.
The members, including four co-chairs, have collectively lived in 27 countries and speak 29 languages among them. They span the ideological spectrum, from former federal appeals court judge Michael McConnell, who was once considered by President George W. Bush for the actual Supreme Court, to Helle Thorning-Schmidt, the former Social Democratic prime minister of Denmark. McConnell and Thorning-Schmidt will co-chair the board, along with Catalina Botero Marino, a former special rapporteur on freedom of expression for the Inter-American Commission on Human Rights, and Jamal Greene, a professor at Columbia Law School who focuses on constitutional law and the First Amendment.
In a call with reporters Wednesday, the board's co-chairs acknowledged the sheer difficulty — if not impossibility — of the task ahead of them. "We're going to be having to select just a few flowers, or maybe they're weeds, from a field of possibilities," McConnell said of the case-selection process. "No one will be satisfied with the decisions that the oversight board makes in every case. We won't please everyone."
Facebook sourced the board chairs through a global consultation process and established an independent legal trust to fund the board going forward. The company then worked with the chairs to select the first group of members. Some of them are noted Facebook critics themselves.
"Social media can spread speech that is hateful, deceitful and harmful. And until now, some of the most difficult decisions around content have been made by Facebook, and you could say, ultimately, by Mark Zuckerberg," Schmidt said on the call. "That's why I feel that this is a huge step for the global community that Facebook has decided to change that."
Right now, a quarter of the members are from the United States, but it will be up to the board to appoint the remaining 20. The board will also have the freedom to decide which cases to take up, based on both recommendations from Facebook and user appeals. Whatever the board decides about whether Facebook should reinstate a given post will be binding.
This grand experiment in oversight on the internet is slated to begin this fall, and it will undoubtedly draw yet more scrutiny to the tech giant.
Protocol spoke with the Facebook team that led the search for the board about who they picked, the impossible task of choosing a 40-person cohort that truly represents the world, and how the board might some day spawn similar oversight bodies for tech companies across the internet.
This interview has been edited and condensed for clarity.
Walk me through the last few months trying to build the board's founding membership team.
Brent Harris (director of governance and strategic initiatives): When we started on this journey and said Facebook can't make all these decisions on its own, we also felt like we didn't have all the answers on how to build this board. We engaged in this consultation process and gathered feedback and heard from stakeholders and critics on what they thought we should build. That resulted in the charter and the bylaws. Part of that consultation has also been an opportunity for us to see what the job is and learn what it is like to take these hard decisions to these people and ask them to deliberate in a panel. That's been an opportunity for us to both learn the job and see hundreds of people in action, taking this on.
Fariba Yassaee (manager of governance and strategic initiatives): It was like putting together pieces of a puzzle. We had so many qualified candidates who had been sourced, vetted and interviewed: people from around the world who were highly qualified to serve. At the end of the day, we were trying to achieve as much diversity as we could in a small group, a small cohort, for this first round.
Tell me about the makeup of the team.
Yassaee: We've got five people from the U.S., two from Latin America, four from Europe, two from sub-Saharan Africa, two from the Middle East and North Africa, two from Central and South Asia, and three from Asia-Pacific and Oceania. Collectively the members have lived in over 27 countries. They speak at least 29 languages among them. All of them have expertise or experience advocating for human rights. Eight work in nonprofits. Two of them have previously served as special rapporteurs in key areas of interest. Three of them are former judges of national or international courts. Six of them are current or former full-time journalists. Two of them are well-versed in computer programming languages. Two are members of the American Academy of Arts and Sciences. One is a Nobel Peace Prize winner. One led a newspaper to win its first Pulitzer Prize.
I'm sure being on the board is an honor for a lot of people. But it's also so much pressure. You know more than anyone the kind of personal attacks that Facebook employees can face about content decisions. I wonder whether you felt any resistance from people who don't want to be on the hook for those decisions?
Yassaee: In all honesty, no. We have people on this board who have been quite critical of Facebook and quite critical of social media. There are many of them, and that will continue. At least among the members I spoke to, there was initially, perhaps, some curiosity around the intent. As they consulted with us over the last several months and years, they came to see this was a legitimate institution we were looking to stand up, one that would be separate from Facebook and answer to users, not the company.
You obviously want the board to be diverse, but you could never build a board that really represents all of the cultures and viewpoints that exist on Facebook. So what did you prioritize?
Yassaee: A lot of it came to us throughout the consultation process. Within the U.S., most people commented on needing to see a diversity of political and social viewpoints. Within sub-Saharan Africa, they pointed out they wanted to see a francophone member; they pointed out they wanted to see the four regions within sub-Saharan Africa covered. Throughout the Middle East, there was talk of Israel, Palestine, Gulf countries.
We got a lot of input along the lines of professions. Since people in the beginning were talking about this as a court, although it's not a court, there was a conversation around how many members should be lawyers and how many should be journalists. We wanted to incorporate that professional diversity. It was also key that we had gender balance on this board. We also heard that people wanted to see linguistic diversity. Latin America is not just Spanish. It was important that we had Portuguese represented as well.
Hopefully people will be pleased with what they see so far. But there's more work to be done.
What happens if the board's decision really fundamentally contradicts something that, say, Mark Zuckerberg feels strongly about? For instance, Facebook's response to COVID propaganda has been pretty firm, and Mark said that's because when you're dealing with a global pandemic, it's a lot more black and white than, say, politics. What if the board members don't see it that way?
Heather Moore (manager of governance and strategic initiatives): That's the reality of oversight. We have been working on this for a year and a half, and as much as we've been consulting externally, we've been really working with the teams internally to garner alignment on that point. This board may make decisions we fundamentally, strongly disagree with. That's why you see in the charter and the bylaws that the board has binding authority. We worked really extensively internally to get that commitment, and that's what we've codified.
The board does have to apply Facebook's community standards and, ultimately, its values, especially looking to human rights norms and standards. It has quite a task ahead of it. But given the profiles of the individuals we've picked for the board and their geographic and intellectual diversity, I think we all feel pretty good about them making the final decision, irrespective of whether or not we disagree with it. Ultimately these are decisions for users and not for us as employees.
Who will make the decision to keep this going or not?
Yassaee: We made a commitment up front to $130 million. That is a commitment that will stay. But the trust will issue reports on a yearly basis, so we'll start the conversation early about putting more money on top of that $130 million, which will lead it on the path to an endowment. Facebook can't make a decision to call the board off. It's a separate legal entity, and the trust allows for more than just Facebook to contribute funding.
Harris: That was conscious on our part in building the trust. It has been built in a way that it can extend beyond Facebook to more parts of the industry.
In other words, it could be applied to other companies?
Harris: That is certainly a possibility. We've built it in a way where other companies can choose to join.
Yassaee: That's actually something we've heard during the consultations: that this should be an industry body, not a Facebook body. You never know where the board might take this in the future.
I'm sure your tech industry brethren just love the idea of Facebook setting up an oversight board for them. Have you heard from other companies that they're interested in being overseen by this?
Harris: We have seen companies participate and leaders within the industry participate in different parts of the consultation. They've come to talks we've held. They've watched this closely. I anticipate this is where the industry is headed. We're already seeing more stakeholders call for this, more experts call for this. I think regulators increasingly will call for the forms of oversight and transparency and user appeals we're building.
Can you say more about which other tech leaders have participated?
Harris: I don't think it would be appropriate for me to do so, but I don't think it's hard to imagine who has participated. We've had interest, phone calls, participation from almost every part of the industry, watching how this is being built, at times also sharing their own perspective and feedback about what they think we're doing right and what's potentially off and what makes them nervous.
Yassaee: It's not just the big tech companies, either. There are a lot of smaller tech companies out there who don't have the resourcing to make content and policy decisions the way some of the bigger ones do.
What will be the measures of whether this board is working?
Moore: A couple of the things the report will look to and speak to are: Is the board taking cases? Is the board issuing decisions? It's really about the operational pieces and functional pieces, not the subjective pieces. Are they continuing to source and search for new members and bring them through the trustee confirmation process? It's really making sure the board is operating and nothing more.
What are some questions you're most eager for the board to answer?
Yassaee: I live in D.C., so politics, politics, politics. A lot that we've heard about throughout this process is: What is the board going to do on political ads? Is that really of importance to the entire board or just the Americans? I'm curious to see how that plays out.