
Facebook’s viral misinformation policy gets put to the test with Hunter Biden story

The company has limited the spread of the story, citing a policy to curb viral misinformation, even before it's been fact-checked.


For most of its history, Facebook has been comfortable keeping an arm's length between itself and these decisions. Now the company is beginning to, in Mark Zuckerberg's own words, "evolve."

Photo: Chip Somodevilla/Getty Images

For weeks, Facebook has been touting the precautions it's taking to prevent interference in this year's presidential election, including a new policy designed to stop the spread of viral misinformation, even before it's been flagged as false by fact-checkers.

Now, that policy is being put to the test.

On Wednesday, Facebook announced it was limiting the spread of a controversial story in the New York Post regarding former Vice President Joe Biden's son Hunter, instantly prompting cries of censorship from Republicans, including Missouri Sen. Josh Hawley, and questions from content moderation scholars about the company's rationale.

"While I will intentionally not link to the New York Post, I want [to] be clear that this story is eligible to be fact-checked by Facebook's third-party fact checking partners," Facebook spokesperson Andy Stone tweeted. "In the meantime, we are reducing its distribution on our platform."

Later, Stone directed Protocol to the company's recently announced policy on viral misinformation, which states, "In many countries, including in the U.S., if we have signals that a piece of content is false, we temporarily reduce its distribution pending review by a third-party fact-checker." Stone didn't respond to Protocol's question about what signals it's received in this case.

Just last week, Facebook's vice president of integrity, Guy Rosen, called the viral content review system "an additional safety net" on a call with reporters. "This system helps us catch content that our regular flows may not pick up fast enough," Rosen said.
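
To make the mechanics concrete, here is a minimal, purely hypothetical sketch of how a "temporarily reduce distribution pending review" rule could be expressed in ranking code. Nothing in it comes from Facebook's actual systems; the signal score, threshold, and demotion factors are invented for illustration.

```python
# Purely illustrative sketch, not Facebook's code: one way a platform could
# temporarily demote a post that shows "signals" of being false while it
# waits in a fact-checking queue. Names, scores, and thresholds are invented.

from dataclasses import dataclass
from typing import Optional

# Hypothetical cutoff for "signals that a piece of content is false."
DEMOTION_THRESHOLD = 0.7


@dataclass
class Post:
    post_id: str
    misinfo_signal_score: float              # assumed 0-1 score from upstream classifiers
    fact_check_rating: Optional[str] = None  # None until a third-party reviewer rates it


def distribution_multiplier(post: Post) -> float:
    """Return a factor applied to the post's reach in feed ranking."""
    if post.fact_check_rating == "false":
        return 0.2  # strong demotion once fact-checkers rate the claim false
    if post.fact_check_rating == "true":
        return 1.0  # restore full distribution if the claim checks out
    if post.misinfo_signal_score >= DEMOTION_THRESHOLD:
        return 0.5  # temporary, milder demotion while review is pending
    return 1.0


if __name__ == "__main__":
    pending = Post(post_id="example-story", misinfo_signal_score=0.85)
    print(distribution_multiplier(pending))  # 0.5: reduced reach pending review
```

The point of the sketch is the ordering: a demotion applied on a signal alone is milder and explicitly temporary, while the lasting call still belongs to the third-party fact-checkers.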

For Facebook, this is precisely the type of story the company has been both preparing for and dreading: a potentially damaging scoop, obtained through questionable means, with obvious ties to Russian interests. The last thing Facebook wants is a repeat of 2016, when its platform became a primary vector in the spread of documents that were hacked by the Russian military and then published on the website DCLeaks. So, lately, Facebook's head of cybersecurity policy, Nathaniel Gleicher, has been warning about such "hack and leak" operations to anyone who will listen, urging media outlets to exercise caution before taking the bait. Last week, Gleicher also warned about a trend called "perception hacking," in which foreign operatives try to feed disinformation to "unwitting news organizations."

So, what is Facebook to do when a story like the Post's comes along, claiming that a mystery laptop, dropped off at a Delaware computer repair shop and never picked up, contains a "smoking gun email" that, if true, suggests Hunter Biden introduced his father to an executive at the Ukrainian energy firm Burisma? It reads like precisely the type of drill that Facebook (and others) have been running to prepare for this election. And so, here is Facebook putting that preparation to use.

Facebook had three options: do nothing, block the story from being shared, or reduce its spread. It picked the third. Twitter, by contrast, picked the second, preventing people from posting the story at all, citing its "hacked materials policy." (Retweets appeared to still go through.) That decision prompted one Post editor to tweet angrily about what he called a "Big Tech information coup" and a "digital civil war."

Facebook's approach is less heavy-handed than Twitter's. But these types of decisions are messy nonetheless. Facebook wrote this policy to address a problem of scale: all the fact-checkers in the world couldn't make their way through all the misinformation and disinformation on a platform of billions. That has meant misinformation often spreads far and wide, polluting the information ecosystem, before it's fact-checked. One recent example: a viral conspiracy theory video called "Plandemic" exploded on Facebook before being removed days later.

For most of its history, Facebook has been comfortable keeping an arm's length between itself and these decisions, preferring to offload determinations of truth to third parties. But that approach has become increasingly problematic for Facebook, prompting the company to, in Mark Zuckerberg's own words, "evolve" and begin banning previously allowed content like Holocaust denialism and anti-vaccination propaganda. Even then, Facebook has often credited the third-party experts who guided those decisions.

Rarely, if ever, has the company justified these judgment calls with something as fuzzy as whether it has "signals" that something might be misinformation. This is one of those calls based not on a hard-and-fast rule, but on a hunch and a hope that Facebook can create a little friction between potential misinformation and the masses before its fact-checkers get to do their jobs.

The claims of censorship were to be expected. Shortly after Stone tweeted, Sen. Hawley sent Zuckerberg a letter with a series of questions about the decision. "If you have evidence that this news story contains 'disinformation' or have otherwise determined that there are inaccuracies with the reporting, will you disclose them to the public so that they can assess your findings?" Hawley asked. "Why did you endeavor to publicly state that such a story was subject to a fact-check? Isn't such a public intervention itself a reflection of Facebook's assessment of a news report's credibility?"

Hawley wrote that the company's intervention suggests "partiality on the part of Facebook."

But not everyone was so critical of this approach. In a lengthy Twitter thread, Renée DiResta, one of the top scholars on the topic of viral misinformation and a technical research manager at Stanford Internet Observatory, wrote that Facebook's decision on the Biden story is "actually a very good use of the policy levers at its disposal."

"There are tradeoffs: if virality is unfettered & nothing is fact-checked, don't be surprised when wild nonsense trends," DiResta wrote. "Provided that this policy is applied in a viewpoint-agnostic way, it seems to be a very solid middle ground for addressing info threats ahead of 2020 and beyond."

It's still unclear what determination Facebook's fact-checkers will make. In the meantime, while conservatives accuse Facebook of censorship, the Hunter Biden story — and Facebook's treatment of it — is getting plenty of exposure on Fox News.
