Facebook’s viral misinformation policy gets put to the test with Hunter Biden story

Citing its policy to curb viral misinformation, the company has limited the story's spread even before it's been fact-checked.

For most of its history, Facebook has been comfortable keeping an arm's length between itself and these decisions. Now the company is beginning to, in Mark Zuckerberg's own words, "evolve."

Photo: Chip Somodevilla/Getty Images

For weeks, Facebook has been touting the precautions it's taking to prevent interference in this year's presidential election, including a new policy designed to stop the spread of viral misinformation, even before it's been flagged as false by fact-checkers.

Now, that policy is being put to the test.

On Wednesday, Facebook announced it was limiting the spread of a controversial story in the New York Post regarding former Vice President Joe Biden's son Hunter, instantly prompting cries of censorship from Republicans, including Missouri Sen. Josh Hawley, and questions from content moderation scholars about the company's rationale.

"While I will intentionally not link to the New York Post, I want [to] be clear that this story is eligible to be fact-checked by Facebook's third-party fact checking partners," Facebook spokesperson Andy Stone tweeted. "In the meantime, we are reducing its distribution on our platform."

Later, Stone directed Protocol to the company's recently announced policy on viral misinformation, which states, "In many countries, including in the U.S., if we have signals that a piece of content is false, we temporarily reduce its distribution pending review by a third-party fact-checker." Stone didn't respond to Protocol's question about what signals it's received in this case.

Just last week, Facebook's vice president of integrity, Guy Rosen, called the viral content review system "an additional safety net" on a call with reporters. "This system helps us catch content that our regular flows may not pick up fast enough," Rosen said.

For Facebook, this is precisely the type of story the company has been both preparing for and dreading: a potentially damaging scoop, obtained through questionable means, with obvious ties to Russian interests. The last thing Facebook wants is a repeat of 2016, when its platform became a primary vector for the spread of documents stolen by Russian military hackers and published on the website DCLeaks. So, lately, Facebook's head of cybersecurity policy, Nathaniel Gleicher, has been warning about such "hack and leak" operations to anyone who will listen, urging media outlets to exercise caution before taking the bait. Last week, Gleicher also warned about a trend called "perception hacking," in which foreign operatives try to feed disinformation to "unwitting news organizations."

So, what is Facebook to do when a story like the Post's comes along, claiming that a mystery laptop, dropped off at a Delaware computer repair shop and never picked up, contains a "smoking gun email" that, if true, suggests Hunter Biden introduced his father to an executive at the Ukrainian energy firm Burisma? It reads like precisely the type of drill Facebook (and others) have been running to prepare for this election. And so, here is Facebook putting that preparation to use.

Facebook had three options: do nothing, block the story from being shared, or reduce its spread. It picked the third. Twitter, by contrast, picked the second, preventing people from posting the story at all and citing its "hacked materials policy." (Retweets appeared to still go through.) That decision prompted one Post editor to tweet angrily about what he called a "Big Tech information coup" and a "digital civil war."

Facebook's approach is less heavy-handed than Twitter's, but these types of decisions are messy nonetheless. Facebook wrote this policy to address a problem of scale: all the fact-checkers in the world couldn't make their way through the misinformation and disinformation on a platform of billions. That means misinformation often spreads far and wide, polluting the information ecosystem, before it's fact-checked. One recent example: the viral conspiracy theory video "Plandemic" exploded on Facebook before being removed days later.

For most of its history, Facebook has been comfortable keeping an arm's length between itself and these decisions, preferring to offload determinations of truth to third parties. But that approach has become increasingly problematic for Facebook, prompting the company to, in Mark Zuckerberg's own words, "evolve" and begin banning previously allowed content like Holocaust denialism and anti-vaccination propaganda. Even then, Facebook has often credited the third-party experts who guided those decisions.

Rarely, if ever, has the company justified these judgment calls with something as fuzzy as whether it has "signals" that something might be misinformation. This is one of those calls based not on a steadfast rule but on a hunch, and on a hope that Facebook can create a little friction between potential misinformation and the masses before its fact-checkers get to do their jobs.

The claims of censorship were to be expected. Shortly after Stone tweeted, Hawley sent Zuckerberg a letter with a series of questions about the decision. "If you have evidence that this news story contains 'disinformation' or have otherwise determined that there are inaccuracies with the reporting, will you disclose them to the public so that they can assess your findings?" Hawley asked. "Why did you endeavor to publicly state that such a story was subject to a fact-check? Isn't such a public intervention itself a reflection of Facebook's assessment of a news report's credibility?"

Hawley wrote that the company's intervention suggests "partiality on the part of Facebook."

But not everyone was so critical of this approach. In a lengthy Twitter thread, Renée DiResta, one of the top scholars of viral misinformation and a technical research manager at the Stanford Internet Observatory, wrote that Facebook's decision on the Biden story is "actually a very good use of the policy levers at its disposal."

"There are tradeoffs: if virality is unfettered & nothing is fact-checked, don't be surprised when wild nonsense trends," DiResta wrote. "Provided that this policy is applied in a viewpoint-agnostic way, it seems to be a very solid middle ground for addressing info threats ahead of 2020 and beyond."

It's still unclear what determination Facebook's fact-checkers will make. In the meantime, while conservatives accuse Facebook of censorship, the Hunter Biden story — and Facebook's treatment of it — is getting plenty of exposure on Fox News.

How European fintech startup N26 is preparing for U.S. regulations

"There's a lot more scrutiny being placed on fintech. We are definitely mindful of it."

In an interview with Protocol, Stephanie Balint, N26's U.S. general manager, discussed the company's approach to regulations in the U.S.

Photo: N26

N26's monster $900 million funding round announced Monday underlined the German startup's momentum in the digital banking market.

Stephanie Balint, N26's U.S. general manager, told Protocol the funding will be used for expansion and to improve "our core offering to make this the most reliable bank that our customers can trust."

The way we work has fundamentally changed. COVID-19 upended business dealings and office work processes, accelerating the move toward digital collaboration platforms that let teams streamline processes and communicate from anywhere. According to International Data Corporation, revenue for worldwide collaboration applications grew 32.9 percent from 2019 to 2020, reaching $22.6 billion, and is expected to reach $50.7 billion by 2025.

"While consumers and early adopter businesses had widely embraced collaborative applications prior to the pandemic, the market saw five years' worth of new users in the first six months of 2020," said Wayne Kurtzman, research director of social and collaboration at IDC. "This has cemented collaboration, at least to some extent, for every business, large and small."

Apple’s new MacBooks are the future — and the past

After years of reinventing the wheel, Apple's back to just building really good ones.

Apple brought back the ports.

Photo: Apple

The 2015 MacBook Pro was, by most accounts, one of the best laptops Apple ever made. It was fast and functional, and it had a great screen, a MagSafe charger, plenty of ports, a great keyboard and solid battery life. If you walked around practically any office in Silicon Valley, you'd see them everywhere.

Many of those users have been holding on to their increasingly old and dusty 2015 Pros, too, because right around the time that computer came out, Apple seemed to lose its way in the laptop market. It released the 12-inch MacBook, an incredibly thin and light computer that introduced a bunch of changes — a new keyboard and trackpad design chief among them — that eventually made their way across the rest of the MacBook lineup. Then came the Touch Bar, Apple's attempt to build an entirely new user interface into a laptop.

Image: Christopher T. Fong/Protocol

Imagine a company where there are no meetings — just time for deep, focused work punctuated by short conversations on Slack and project updates on Trello.

Now imagine a company where the no-meeting ethos is so ingrained that it's possible to work there for 10 years without ever speaking face-to-face with a single coworker, and for your boss to not even recognize the sound of your voice.


#AppleToo activist says Apple fired her for deleting apps from her devices

Janneke Parrish says she was dismissed after deleting Robinhood, Pokemon Go and Google Drive from her work devices during an investigation inside the company.

The #AppleToo movement is trying to organize Apple workers into a collective voice.
Photo: Bloomberg via Getty

Unlike most other companies, Apple asks its employees to use their work phones like personal ones — and for five years, Apple program manager Janneke Parrish did as she was told. But last week, when Apple asked Parrish to hand over her devices as part of an internal investigation, she was afraid the company would see her personal and private information. She disobeyed orders and deleted apps like Robinhood, Pokemon Go and Google Drive. Then Apple fired her.
