For weeks, Facebook has been touting the precautions it's taking to prevent interference in this year's presidential election, including a new policy designed to stop the spread of viral misinformation, even before it's been flagged as false by fact-checkers.
Now, that policy is being put to the test.
On Wednesday, Facebook announced it was limiting the spread of a controversial story in the New York Post regarding former Vice President Joe Biden's son Hunter, instantly prompting cries of censorship from Republicans, including Missouri Sen. Josh Hawley, and questions from content moderation scholars about the company's rationale.
"While I will intentionally not link to the New York Post, I want [to] be clear that this story is eligible to be fact-checked by Facebook's third-party fact checking partners," Facebook spokesperson Andy Stone tweeted. "In the meantime, we are reducing its distribution on our platform."
Later, Stone directed Protocol to the company's recently announced policy on viral misinformation, which states, "In many countries, including in the U.S., if we have signals that a piece of content is false, we temporarily reduce its distribution pending review by a third-party fact-checker." Stone didn't respond to Protocol's question about what signals Facebook had received in this case.
Just last week, Facebook's vice president of integrity, Guy Rosen, called the viral content review system "an additional safety net" on a call with reporters. "This system helps us catch content that our regular flows may not pick up fast enough," Rosen said.
For Facebook, this is precisely the type of story the company has been both preparing for and dreading: a potentially damaging scoop, obtained through questionable means, with possible ties to Russian interests. The last thing Facebook wants is a repeat of 2016, when its platform became a primary vector for the spread of documents hacked by Russian military intelligence and published on the website DCLeaks. So, lately, Facebook's head of cybersecurity policy, Nathaniel Gleicher, has been warning anyone who will listen about such "hack and leak" operations, urging media outlets to exercise caution before taking the bait. Last week, Gleicher also warned about a trend called "perception hacking," in which foreign operatives try to feed disinformation to "unwitting news organizations."
So, what is Facebook to do when a story like the Post's comes along, claiming that a mystery laptop, dropped off at a Delaware computer repair shop and never picked up, contains a "smoking gun email" that, if true, suggests Hunter Biden introduced his father to an executive at the Ukrainian energy firm Burisma? It reads like precisely the type of drill that Facebook (and others) have been running to prepare for this election. And here is Facebook putting that preparation to use.
Facebook had three options: do nothing, block the story from being shared, or reduce its spread. It picked the third. Twitter, by contrast, picked the second, preventing people from posting the story at all, citing its "hacked materials policy." (Retweets appeared to still go through.) That decision prompted one Post editor to tweet angrily about what he called a "Big Tech information coup" and a "digital civil war."
Facebook's approach is less heavy-handed than Twitter's. But these types of decisions are messy nonetheless. Facebook wrote this policy to address a problem of scale: all the fact-checkers in the world couldn't make their way through all the misinformation and disinformation on a platform of billions. That has meant misinformation often spreads far and wide, polluting the information ecosystem, before it's fact-checked. One recent example: a viral conspiracy theory video called "Plandemic" exploded on Facebook before being removed days later.
For most of its history, Facebook has been comfortable keeping decisions like these at arm's length, preferring to offload determinations of truth to third parties. But that approach has become increasingly problematic for Facebook, prompting the company to, in Mark Zuckerberg's own words, "evolve" and begin banning previously allowed content like Holocaust denial and anti-vaccination propaganda. Even then, Facebook has often credited the third-party experts who guided those decisions.
Rarely, if ever, has the company justified these judgment calls with something as fuzzy as whether it has "signals" that something might be misinformation. This is a call based not on a steadfast rule but on a hunch, and on a hope that Facebook can create a little friction between potential misinformation and the masses before its fact-checkers get to do their jobs.
The claims of censorship were to be expected. Shortly after Stone tweeted, Sen. Hawley sent Zuckerberg a letter with a series of questions about the decision. "If you have evidence that this news story contains 'disinformation' or have otherwise determined that there are inaccuracies with the reporting, will you disclose them to the public so that they can assess your findings?" Hawley asked. "Why did you endeavor to publicly state that such a story was subject to a fact-check? Isn't such a public intervention itself a reflection of Facebook's assessment of a news report's credibility?"
Hawley wrote that the company's intervention suggests "partiality on the part of Facebook."
But not everyone was so critical of this approach. In a lengthy Twitter thread, Renée DiResta, one of the top scholars of viral misinformation and a technical research manager at the Stanford Internet Observatory, wrote that Facebook's decision on the Biden story is "actually a very good use of the policy levers at its disposal."
"There are tradeoffs: if virality is unfettered & nothing is fact-checked, don't be surprised when wild nonsense trends," DiResta wrote. "Provided that this policy is applied in a viewpoint-agnostic way, it seems to be a very solid middle ground for addressing info threats ahead of 2020 and beyond."
It's still unclear what determination Facebook's fact-checkers will make. In the meantime, while conservatives accuse Facebook of censorship, the Hunter Biden story — and Facebook's treatment of it — is getting plenty of exposure on Fox News.