
Power

Facebook’s viral misinformation policy gets put to the test with Hunter Biden story

The company has limited the spread of the story, citing a policy to curb viral misinformation, even before it's been fact-checked.

Mark Zuckerberg standing in front of his own face on a screen. Photo: Chip Somodevilla/Getty Images

For weeks, Facebook has been touting the precautions it's taking to prevent interference in this year's presidential election, including a new policy designed to stop the spread of viral misinformation, even before it's been flagged as false by fact-checkers.

Now, that policy is being put to the test.

On Wednesday, Facebook announced it was limiting the spread of a controversial story in the New York Post regarding former Vice President Joe Biden's son Hunter, instantly prompting cries of censorship from Republicans, including Missouri Sen. Josh Hawley, and questions from content moderation scholars about the company's rationale.

"While I will intentionally not link to the New York Post, I want [to] be clear that this story is eligible to be fact-checked by Facebook's third-party fact checking partners," Facebook spokesperson Andy Stone tweeted. "In the meantime, we are reducing its distribution on our platform."

Later, Stone directed Protocol to the company's recently announced policy on viral misinformation, which states, "In many countries, including in the U.S., if we have signals that a piece of content is false, we temporarily reduce its distribution pending review by a third-party fact-checker." Stone didn't respond to Protocol's question about what signals it's received in this case.

Just last week, Facebook's vice president of integrity, Guy Rosen, called the viral content review system "an additional safety net" on a call with reporters. "This system helps us catch content that our regular flows may not pick up fast enough," Rosen said.
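Pieced together, Stone's and Rosen's descriptions amount to a three-step flow: detect signals that a piece of content may be false, temporarily reduce its distribution, then apply the fact-checkers' verdict once it arrives. The sketch below, in Python, is purely illustrative; every name, threshold and demotion factor in it is an assumption, since Facebook has not disclosed how the system actually works.

```python
from dataclasses import dataclass
from typing import Optional

# Illustrative constants; Facebook has not published its real thresholds.
SIGNAL_THRESHOLD = 0.7  # hypothetical cutoff for "signals that content is false"
PENDING_DEMOTION = 0.2  # hypothetical reach multiplier while a fact-check is pending
FALSE_DEMOTION = 0.05   # hypothetical reach multiplier after a "false" rating

@dataclass
class Post:
    post_id: str
    misinfo_signal_score: float               # e.g. classifier output, user reports
    fact_check_verdict: Optional[str] = None  # None until third-party reviewers rule
    distribution_multiplier: float = 1.0      # 1.0 = normal feed distribution

def apply_viral_misinfo_policy(post: Post) -> Post:
    """Demote suspect content pending review, then apply the verdict when it lands."""
    if post.fact_check_verdict == "false":
        # Rated false: demote sharply (demotion, not removal).
        post.distribution_multiplier = FALSE_DEMOTION
    elif post.fact_check_verdict is not None:
        # Fact-checkers cleared the post: restore normal distribution.
        post.distribution_multiplier = 1.0
    elif post.misinfo_signal_score >= SIGNAL_THRESHOLD:
        # No verdict yet, but signals suggest the post may be false: temporarily
        # reduce its reach. This is the pre-fact-check "safety net" step.
        post.distribution_multiplier = PENDING_DEMOTION
    return post

# Example: a suspect post gets demoted before any fact-checker has ruled.
post = apply_viral_misinfo_policy(Post("example-post", misinfo_signal_score=0.9))
print(post.distribution_multiplier)  # 0.2
```

The order of operations is the point: the demotion happens before any verdict exists, which is exactly the step critics of the decision would go on to object to.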

For Facebook, this is precisely the type of story the company has been both preparing for and dreading: a potentially damaging scoop, obtained through questionable means, with obvious ties to Russian interests. The last thing Facebook wants is a repeat of 2016, when its platform became a primary vector in the spread of documents that were hacked by Russian military intelligence and then published on the website DCLeaks. So, lately, Facebook's head of cybersecurity policy, Nathaniel Gleicher, has been warning about such "hack and leak" operations to anyone who will listen, urging media outlets to exercise caution before taking the bait. Last week, Gleicher also warned about a trend called "perception hacking," in which foreign operatives try to feed disinformation to "unwitting news organizations."

So, what is Facebook to do when a story like the Post's comes along, claiming that a mystery laptop, dropped off at a Delaware computer repair shop and never picked up, contains a "smoking gun email" that, if authentic, suggests Hunter Biden introduced his father to an executive at the Ukrainian energy firm Burisma? It reads like precisely the type of drill that Facebook (and others) have been running to prepare for this election. And so, here is Facebook putting that preparation to use.

Faced with the options of doing nothing, blocking the story from being shared, or reducing its spread, Facebook picked the third door. Twitter, by contrast, picked the second, preventing people from posting the story at all, citing its "hacked materials policy." (Retweets appeared to still go through.) That decision prompted one Post editor to tweet angrily about what he called a "Big Tech information coup" and a "digital civil war."

Facebook's approach is less heavy-handed than Twitter's, but these types of decisions are messy nonetheless. Facebook wrote this policy to address a problem of scale: all the fact-checkers in the world couldn't make their way through all the misinformation and disinformation on a platform of billions. That has meant misinformation often spreads far and wide, polluting the information ecosystem, before it's fact-checked. One recent example: a viral conspiracy theory video called "Plandemic" exploded on Facebook before being removed days later.

For most of its history, Facebook has been comfortable keeping an arm's length between itself and these decisions, preferring to offload determinations of truth to third parties. But that approach has become increasingly problematic for Facebook, prompting the company to, in Mark Zuckerberg's own words, "evolve" and begin banning previously allowed content like Holocaust denialism and anti-vaccination propaganda. Even then, Facebook has often credited the third-party experts who guided those decisions.

Rarely, if ever, has the company justified these judgment calls with something as fuzzy as whether or not it has "signals" that something might be misinformation. This is one of those calls based not on a steadfast rule, but on a hunch, and a hope that Facebook can create a little bit of friction between potential misinformation and the masses before its fact-checkers get to do their jobs.

The claims of censorship were to be expected. Shortly after Stone tweeted, Sen. Hawley sent Zuckerberg a letter with a series of questions about the decision. "If you have evidence that this news story contains 'disinformation' or have otherwise determined that there are inaccuracies with the reporting, will you disclose them to the public so that they can assess your findings?" Hawley asked. "Why did you endeavor to publicly state that such a story was subject to a fact-check? Isn't such a public intervention itself a reflection of Facebook's assessment of a news report's credibility?"

Hawley wrote that the company's intervention suggests "partiality on the part of Facebook."

But not everyone was so critical of this approach. In a lengthy Twitter thread, Renée DiResta, one of the top scholars on the topic of viral misinformation and a technical research manager at Stanford Internet Observatory, wrote that Facebook's decision on the Biden story is "actually a very good use of the policy levers at its disposal."

"There are tradeoffs: if virality is unfettered & nothing is fact-checked, don't be surprised when wild nonsense trends," DiResta wrote. "Provided that this policy is applied in a viewpoint-agnostic way, it seems to be a very solid middle ground for addressing info threats ahead of 2020 and beyond."

It's still unclear what determination Facebook's fact-checkers will make. In the meantime, while conservatives accuse Facebook of censorship, the Hunter Biden story — and Facebook's treatment of it — is getting plenty of exposure on Fox News.

Politics

'Woke tech' and 'the new slave power': Conservatives gather for Vegas summit

An agenda for the event, hosted by the Claremont Institute, listed speakers including U.S. CTO Michael Kratsios and Texas Attorney General Ken Paxton.

Photo: David Vives/Unsplash

Conservative investors, political operatives, right-wing writers and Trump administration officials are quietly meeting in Las Vegas this weekend to discuss topics including China, "woke tech" and "the new slave power," according to four people who were invited to attend or speak at the event as well as a copy of the agenda obtained by Protocol.

The so-called "Digital Statecraft Summit" was organized by the Claremont Institute, a conservative think tank that says its mission is to "restore the principles of the American Founding to their rightful, preeminent authority in our national life." A list of speakers for the event includes a combination of past and current government officials as well as a who's who of far-right provocateurs. One speaker, conservative legal scholar John Eastman, rallied the president's supporters at a White House event before the Capitol Hill riot earlier this month. Some others have been associated with racist ideologies.

Emily Birnbaum

Politics

The other reason Facebook silenced Trump? Republicans lost power.

Yes, the president's acts were unprecedented. But Facebook is also preparing for a new Washington, controlled by Democrats.

Mark Zuckerberg and Facebook's head of public policy Joel Kaplan have spent four years bending to conservatives' demands. Now, Facebook is bending in a new direction.

Photo: Samuel Corum/Getty Images

In his post announcing that President Trump would be blocked from posting on Facebook until at least Inauguration Day, Mark Zuckerberg wrote that the president's incitement of the violent mob that stormed the U.S. Capitol building Wednesday was "fundamentally different" than any of the offenses he's committed on Facebook before. "The risks of allowing the President to continue to use our service during this period are simply too great," he wrote on Thursday.

That may be true. But there's another reason why — after four years spent insisting that a tech company has no business shutting up the president of the United States, no matter how much he threatens to shoot protesters or engages in voter suppression — Zuckerberg finally had a change of heart: Republicans just lost power.

Issie Lapowsky
Power

Pressure mounts on tech giants to ban Trump, as rioters storm Capitol

Facebook, Twitter and YouTube removed a video in which Trump expressed love for the rioters, but none of the companies have banned him outright — yet.

Twitter locked President Trump's account.

Image: Twitter

Twitter, Facebook and YouTube took action against several of President Trump's posts Wednesday, labeling them, limiting reshares and removing a video in which Trump expressed his love for the rioters who stormed the U.S. Capitol building. The riot led to the evacuation of the Senate, the deployment of the National Guard and one person being shot and killed. Twitter locked Trump's account, requiring him to remove three tweets and saying his account would remain locked for 12 hours after those tweets were removed; it also warned that any future violations would get him banned. Facebook locked his account for 24 hours, citing "two policy violations." These actions followed a day of calls from tech investors, academics and others to kick Trump off their platforms once and for all.

In an early tweet, University of Virginia law professor Danielle Citron implored Twitter CEO Jack Dorsey to take action. "As someone who has served on your Trust and Safety Board since its inception and counseled you since 2009, time is now to suspend President Trump's account," Citron wrote. "He has deliberately incited violence, causing mayhem with his lies and threats."

Issie Lapowsky
Politics

Here’s how Big Tech is preparing for regulations in 2021

Companies know that the heat is only going to increase this year.

2021 promises to be a turbulent year for Big Tech.

Photo: Ting Shen/Getty Images

The open internet. Section 230. China. Internet access. 5G. Antitrust. When we asked the policy shops at some of the biggest and most powerful tech companies to identify their 2021 policy priorities, these were the words they had in common.

Each of these issues centers on a common theme. "Despite how tech companies might feel, they've been enjoying a very high innovation phase. They're about to experience a strong regulation phase," said Erika Fisher, Atlassian's general counsel and chief administrative officer. "The question is not if, but how that regulation will be shaped."

Anna Kramer

Photo: Stefani Reynolds/Getty Images

For the first time ever, there's a real chance that Facebook and Google could be broken up.

It's going to be a tough, years-long battle. But the companies are facing existential legal threats as government regulators and state attorneys general bring five separate antitrust cases against them: two against Facebook and three against Google.

Emily Birnbaum
