Facebook on Friday announced it would limit former President Donald Trump's suspension to two years after its semi-independent Oversight Board criticized the indefinite nature of the ban the company imposed in the wake of the Jan. 6 riot at the U.S. Capitol.
In a company blog post, Nick Clegg, Facebook's vice president of global affairs, said the new suspension period is for actions that "constituted a severe violation of our rules," adding that the suspension could be extended if "there is still a serious risk to public safety" such as instances of violence or other civil unrest when it is set to expire.
The two-year period began with the original suspension, on Jan. 7, and would theoretically end in time for Trump to launch another run at the White House in 2024, a possibility that loomed over both the initial ban and the board's decision to review it.
Trump, who encouraged violent supporters attempting to overturn his loss in the election, could also face a permanent ban under "a strict set of rapidly escalating sanctions" that will come into force if he returns.
Facebook announced the new measures as its response to the board's concerns about the company's initial decision on Trump's account. The board upheld the suspension at the time, but said a permanent ban was inappropriate. It also recommended changes to the way Facebook handles content, particularly from prominent, influential and political accounts and pages.
The board called on Facebook to "address widespread confusion about how decisions relating to influential users are made." During the board's deliberations about Trump, Facebook said it had not applied its so-called "newsworthiness" exception, which was reportedly created in part in response to Trump's own incendiary remarks. In considering the Trump case, the board took issue with Facebook's policy, arguing the company shouldn't have special rules for certain influential users.
As part of Friday's response, Facebook said it would begin publishing notices when it leaves up content that violates its policies because of the newsworthiness of the post. The company said it "will remove content if the risk of harm outweighs the public interest," and that the "newsworthiness" factor will apply equally regardless of whether the poster is a political figure, though in practice the pronouncements of government officials tend to be more newsworthy.
The decisions on newsworthiness could begin to remake the relationship between Facebook and world leaders, who have relied on social media for governing messages and campaigns, including sometimes threatening or hateful posts. In the U.S., for instance, the change could heighten the risk of social media regulation, as Republican lawmakers already say they're being silenced by Facebook and other tech giants.
"We know today's decision will be criticized by many people on opposing sides of the political divide," Clegg wrote, "but our job is to make a decision in as proportionate, fair and transparent a way as possible."
Trump, who was also removed from Twitter, has sought other ways to get his message out to the public, including through a short-lived blog that his team both unveiled and removed in May. A senior aide, Jason Miller, suggested Trump was working on other forms of online presence.
Several of the board's other recommendations had to do with bringing more transparency to the company's content moderation decisions. Board members asked Facebook to publicly explain the rules regarding account suspensions, to clarify its system of using strikes and other penalties before taking action on an account and to inform users of the number of strikes on their accounts.
Facebook also said on Friday that in the future, it would publish the criteria for the strike system it uses to punish accounts that have violated its rules. That way, users will "know what actions our systems will take if they violate our policies." Facebook said that whether it applies strikes to users would depend "on the severity of the content, the context in which it was shared and when it was posted," with all strikes expiring after a year.
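Facebook hasn't published the mechanics of that system, but its description (severity-weighted strikes that expire after a year and trigger escalating penalties) maps onto a simple data model. The sketch below is a hypothetical illustration only: the class names, severity weights and penalty thresholds are invented, not Facebook's actual criteria.

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta

# Hypothetical model of an expiring-strike system. The one-year expiry
# mirrors the detail Facebook disclosed; the severity weights and
# penalty tiers below are invented for illustration.

STRIKE_TTL = timedelta(days=365)

@dataclass
class Strike:
    issued_at: datetime
    severity: int  # assumption: 1 = minor violation, 3 = severe

@dataclass
class Account:
    strikes: list = field(default_factory=list)

    def active_strikes(self, now: datetime) -> list:
        # Per Facebook's description, strikes expire after a year.
        return [s for s in self.strikes if now - s.issued_at < STRIKE_TTL]

    def penalty(self, now: datetime) -> str:
        # Escalating sanctions based on the weighted count of active strikes.
        score = sum(s.severity for s in self.active_strikes(now))
        if score == 0:
            return "none"
        if score < 3:
            return "warning"
        if score < 6:
            return "temporary suspension"
        return "extended suspension"

# Example: one strike has already lapsed, so only the recent one counts.
account = Account()
account.strikes.append(Strike(datetime(2020, 1, 15), severity=1))
account.strikes.append(Strike(datetime(2021, 5, 1), severity=1))
print(account.penalty(datetime(2021, 6, 4)))  # -> warning
```

The point of publishing such criteria, as Facebook describes it, is that users could trace exactly which tier they fall into rather than guessing.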
This could be a particularly meaningful change, and not just for Facebook's more famous users. As the Wall Street Journal recently reported, users who land in "Facebook jail" often have little recourse to get out of it, or even an explanation of why they were suspended in the first place. That's an issue for influential accounts, but it's even more challenging for regular users, whose suspensions rarely receive the level of attention that Trump's did.
Facebook said it had fully implemented 15 of the board's 19 recommendations, but it sidestepped one of the most pointed proposals: that Facebook undertake a comprehensive review of its role in exacerbating political tensions leading up to the Jan. 6 riot. On that point, Facebook said only that it would "continue to cooperate with law enforcement and any US government investigations related to the events" and expand its research into the effect of Facebook and Instagram on elections.
"Ultimately, though, we believe that independent researchers and our democratically elected officials are best positioned to complete an objective review of these events," Facebook said in an extended response. The company is now extending its existing research project with outside academics, giving them access to data through February 2021.
Facebook also said it was assessing the feasibility of a separate board recommendation regarding its quarterly transparency reports. Right now, Facebook publishes information only on how much content it removes for policy violations, but the board asked it to also include data about profile, page and account restrictions, as well as where those restrictions are imposed. The company said some of that data, particularly on location, might be unreliable, since malicious actors can disguise where they operate.
The board said in a tweet that it was reviewing Facebook's updates and would "offer further comment once this review is complete."