Source Code: Your daily look at what matters in tech.


Facebook’s viral misinformation policy gets put to the test with Hunter Biden story

The company has limited the spread of the story, citing a policy to curb viral misinformation, even before it's been fact-checked.


For most of its history, Facebook has been comfortable keeping an arm's length between itself and these decisions. Now the company is beginning to, in Mark Zuckerberg's own words, "evolve."

Photo: Chip Somodevilla/Getty Images

For weeks, Facebook has been touting the precautions it's taking to prevent interference in this year's presidential election, including a new policy designed to stop the spread of viral misinformation, even before it's been flagged as false by fact-checkers.

Now, that policy is being put to the test.

On Wednesday, Facebook announced it was limiting the spread of a controversial story in the New York Post regarding former Vice President Joe Biden's son Hunter, instantly prompting cries of censorship from Republicans, including Missouri Sen. Josh Hawley, and questions from content moderation scholars about the company's rationale.

"While I will intentionally not link to the New York Post, I want [to] be clear that this story is eligible to be fact-checked by Facebook's third-party fact checking partners," Facebook spokesperson Andy Stone tweeted. "In the meantime, we are reducing its distribution on our platform."

Later, Stone directed Protocol to the company's recently announced policy on viral misinformation, which states, "In many countries, including in the U.S., if we have signals that a piece of content is false, we temporarily reduce its distribution pending review by a third-party fact-checker." Stone didn't respond to Protocol's question about what signals it's received in this case.

Just last week, Facebook's vice president of integrity, Guy Rosen, called the viral content review system "an additional safety net" on a call with reporters. "This system helps us catch content that our regular flows may not pick up fast enough," Rosen said.

For Facebook, this is precisely the type of story the company has been both preparing for and dreading: A potentially damaging scoop, obtained through questionable means, with obvious ties to Russian interests. The last thing Facebook wants is a repeat of 2016, when its platform became a primary vector in the spread of documents that were hacked by Russian military and then published on the website DCLeaks. So, lately, Facebook's head of cybersecurity policy, Nathaniel Gleicher, has been warning about such "hack and leak" operations to anyone who will listen, urging media outlets to take caution before taking the bait. Last week, Gleicher also warned about a trend called "perception hacking," in which foreign operatives try to feed disinformation to "unwitting news organizations."

So, what is Facebook to do when a story like the Post's comes along, claiming that a mystery laptop, dropped off at a Delaware computer repair shop and never picked up, contains a "smoking gun email" that, if true, suggests Hunter Biden introduced his father to an executive at the Ukrainian energy firm Burisma? It reads like precisely the type of drill that Facebook (and others) have been running to prepare for this election. And so, here is Facebook putting that preparation to use.

Faced with three options — do nothing, block the story from being shared, or reduce its spread — Facebook picked the third door. Twitter, by contrast, picked the second, preventing people from posting the story at all, citing its "hacked materials policy." (Retweets appeared to still go through.) That decision prompted one Post editor to tweet angrily about what he called a "Big Tech information coup" and a "digital civil war."

Facebook's approach is less heavy-handed than Twitter's. But these types of decisions are messy nonetheless. Facebook wrote this policy to address a problem of scale — the fact that all the fact-checkers in the world couldn't make their way through all the misinformation and disinformation on a platform of billions. That has meant that misinformation often spreads far and wide, polluting the information ecosystem, before it's fact-checked. One recent example: a viral conspiracy theory video called "Plandemic" exploded on Facebook before being removed days later.

For most of its history, Facebook has been comfortable keeping an arm's length between itself and these decisions, preferring to offload determinations of truth to third parties. But that approach has become increasingly problematic for Facebook, prompting the company to, in Mark Zuckerberg's own words, "evolve" and begin banning previously allowed content like Holocaust denialism and anti-vaccination propaganda. Even then, Facebook has often credited the third-party experts who guided those decisions.

Rarely, if ever, has the company justified these judgment calls with something as fuzzy as whether or not it has "signals" that something might be misinformation. This is one of those calls based not on a steadfast rule, but on a hunch, and a hope that Facebook can create a little bit of friction between potential misinformation and the masses before its fact-checkers get to do their jobs.

The claims of censorship were to be expected. Shortly after Stone tweeted, Sen. Hawley sent Zuckerberg a letter with a series of questions about the decision. "If you have evidence that this news story contains 'disinformation' or have otherwise determined that there are inaccuracies with the reporting, will you disclose them to the public so that they can assess your findings?" Hawley asked. "Why did you endeavor to publicly state that such a story was subject to a fact-check? Isn't such a public intervention itself a reflection of Facebook's assessment of a news report's credibility?"

Hawley wrote that the company's intervention suggests "partiality on the part of Facebook."

But not everyone was so critical of this approach. In a lengthy Twitter thread, Renée DiResta, one of the top scholars on the topic of viral misinformation and a technical research manager at Stanford Internet Observatory, wrote that Facebook's decision on the Biden story is "actually a very good use of the policy levers at its disposal."

"There are tradeoffs: if virality is unfettered & nothing is fact-checked, don't be surprised when wild nonsense trends," DiResta wrote. "Provided that this policy is applied in a viewpoint-agnostic way, it seems to be a very solid middle ground for addressing info threats ahead of 2020 and beyond."

It's still unclear what determination Facebook's fact-checkers will make. In the meantime, while conservatives accuse Facebook of censorship, the Hunter Biden story — and Facebook's treatment of it — is getting plenty of exposure on Fox News.

The metaverse is coming, and Robinhood's IPO is here

Plus, what we learned from Big Tech's big quarter.

Image: Roblox

On this episode of the Source Code podcast: First, a few takeaways from another blockbuster quarter in the tech industry. Then, Janko Roettgers joins the show to discuss Big Tech's obsession with the metaverse and the platform war that seems inevitable. Finally, Ben Pimentel talks about Robinhood's IPO, and the company's crazy route to the public markets.


David Pierce

David Pierce (@pierce) is Protocol's editor at large. Prior to joining Protocol, he was a columnist at The Wall Street Journal, a senior writer with Wired, and deputy editor at The Verge. He owns all the phones.

After a year and a half of living and working through a pandemic, it's no surprise that employees are sending out stress signals at record rates. According to a 2021 study by Indeed, 52% of employees today say they feel burnt out. Over half of employees report working longer hours, and a quarter say they're unable to unplug from work.

The continued swell of reported burnout is a concerning trend for employers everywhere. Not only does it harm mental health and well-being, but it can also impact absenteeism, employee retention and — between the drain on morale and high turnover — your company culture.

Crisis management is one thing, but how do you permanently lower the temperature so your teams can recover sustainably? Companies around the world are now taking larger steps to curb burnout, with industry leaders like LinkedIn, Hootsuite and Bumble shutting down their offices for a full week to allow all employees extra time off. The CEO of Okta, worried about burnout, asked all employees to email him their vacation plans in 2021.

Stella Garber
Stella Garber is Trello's head of marketing. Stella has led marketing at Trello for the last seven years, from early-stage startup through its acquisition by Atlassian in 2017 and beyond. She was an early champion of remote work, having led remote teams for more than a decade.

Facebook wants to be like Snapchat

Facebook is looking to make posts disappear, Google wants to make traffic reports more accurate, and more patents from Big Tech.

Facebook has ephemeral posts on its mind.

Image: Protocol

Welcome to another week of Big Tech patents. Google wants to make traffic reports more accurate, Amazon wants to make voice assistants more intelligent, Microsoft wants to make scheduling meetings more convenient, and a ton more.

As always, remember that the big tech companies file all kinds of crazy patents for things, and though most never amount to anything, some end up defining the future.

Karyne Levy

Karyne Levy (@karynelevy) is the West Coast editor at Protocol. Before joining Protocol, Karyne was a senior producer at Scribd, helping to create the original content program. Prior to that she was an assigning editor at NerdWallet, a senior tech editor at Business Insider, and the assistant managing editor at CNET, where she also hosted Rumor Has It for CNET TV. She lives outside San Francisco with her wife, son and lots of pets.

Protocol | China

China’s edtech crackdown isn’t what you think. Here’s why.

It's part of an attempt to fix education inequality and address a looming demographic crisis.

In the past decade, China's private tutoring market has expanded rapidly as it's been digitized and bolstered by capital.

Photo: Getty Images

Beijing's strike against the private tutoring and edtech industry has rattled the market and led observers to try to answer one big question: What is Beijing trying to achieve?

Sweeping policy guidelines issued on July 24 by the Central Committee of the Chinese Communist Party and the State Council now mandate that existing private tutoring companies register as nonprofit organizations. Extracurricular tutoring companies will be banned from going public. Online tutoring agencies will be subject to regulatory approval.

Shen Lu

Shen Lu is a reporter with Protocol | China. She has spent six years covering China from inside and outside its borders. Previously, she was a fellow at Asia Society's ChinaFile and a Beijing-based producer for CNN. Her writing has appeared in Foreign Policy, The New York Times and POLITICO, among other publications. Shen Lu is a founding member of Chinese Storytellers, a community serving and elevating Chinese professionals in the global media industry.

It’s soul-destroying and it uses DRM, therefore Peloton is tech

"I mean, the pedals go around if you turn off all the tech, but Peloton isn't selling a pedaling product."

Is this tech? Or is it just a bike with a screen?

Image: Peloton and Protocol

One of the breakout hits from the pandemic, besides Taylor Swift's "Folklore," has been Peloton. With upwards of 5.4 million members as of March and nearly $1.3 billion in revenue that quarter, a lot of people are turning in their gym memberships for a bike or a treadmill and a slick-looking app.

But here at Protocol, it's that slick-looking app, plus all the tech that goes into it, that matters. And that's where things got really heated during our chat this week. Is Peloton tech? Or is it just a bike with a giant tablet on it? Can all bikes be tech with a little elbow grease?

Karyne Levy

