Mark Zuckerberg has said for years that Facebook shouldn't be an "arbiter of truth." But when the coronavirus outbreak hit, his team rushed to ensure Facebook's billions of users would see information from health authorities like the Centers for Disease Control and Prevention and the World Health Organization before anything else.
Zuckerberg explained why Facebook has been quicker to take action here than it has been in other contexts, saying that a pandemic is "probably one of the most black-and-white situations" there is, and that, unlike in politics, there are already "broadly trusted authorities" Facebook can direct users to for answers.
But as the science surrounding COVID-19 evolves, the "truth" about the virus and how it spreads is still very much in question, leaving researchers who study Facebook and other social media platforms to wonder whether relying on these "broadly trusted authorities" is such a black-and-white decision after all. When this crisis ends, they worry, will this reliance on government-sponsored sources stick? And what will the repercussions be? The epidemic makes clear just how difficult it remains for social media companies to fight misinformation and encourage the spread of facts.
"The CDC and the WHO are operating with incomplete information at the moment. This is a new disease, something nobody's seen before," Renee DiResta, technical research manager at Stanford Internet Observatory, said Wednesday during a virtual conference organized by the Stanford Institute for Human-Centered Artificial Intelligence. DiResta says that's led to poor communication from "institutional authorities," which raises important questions for the social media companies now relying on them.
Take, for example, the warning the French minister of health sent out, urging people not to take ibuprofen for COVID-19 fevers. The European Medicines Agency and the WHO later refuted this claim, spurring widespread, global confusion about whether ibuprofen was or wasn't safe. Now, a similar debate is brewing within the CDC about whether the agency should revise its current guidelines and encourage even healthy Americans to wear masks.
"How do we appropriately communicate to people where institutions got things wrong, but also contextualize that?" DiResta said. "It's one thing to point to overt conspiracies. It becomes really challenging when institutions are getting things wrong, are slow to react, and are not transparently communicating."
The steps Facebook, YouTube and others have taken to crack down on misinformation about coronavirus and surface reputable information are largely informed by their responses to the 2019 measles outbreak, DiResta said. Decades of research into measles and vaccinations made the science clear: Vaccinations are good, and anti-vaccination propaganda is bad. So it made sense for Facebook to limit the spread of posts from what DiResta calls the "grifters and the conspiracy theorists" in anti-vaccine groups and surface more reputable content from the CDC and WHO.
Given how much is still unknown about COVID-19 — as well as what's becoming known about how global governments have withheld information about the virus — DiResta said it's not clear that "institutional credentialism" is as reliable a marker of trustworthiness in this context.
That puts social media platforms in a trickier position, she said, of needing to "continue to minimize the grift and the conspiracy, while at the same time recognizing that you don't want to over-index on institutional authority."
In deferring to organizations like the CDC and WHO, tech giants also run the risk of giving whoever holds power in government full control over the information people are receiving about the virus online. That's one reason Democrats have condemned Google's policy of prohibiting nongovernmental ads related to the virus. They argue it gives the Trump administration the power to shape the narrative around the crisis. That sort of deference to government agencies could be especially dangerous in dictatorships around the world.
"When [platforms] privilege authoritative voices in some contexts, that sounds great, in other contexts, well, that doesn't sound so great," Kate Starbird, associate professor of human centered design and engineering at the University of Washington, said during the virtual event. "I've been critical of similar policies in the China context, and now we're seeing them in the U.S. And in some cases, we're celebrating."
It's true that for years, reporters, academics, lawmakers and even the public have pushed Facebook, Google, Twitter and other platforms to do more to stop the spread of misinformation online. Now that the platforms are stepping up to do just that in the current crisis, some are understandably applauding those efforts. Neither DiResta nor Starbird is suggesting that should stop.
But Starbird warned that crises tend to make people more accepting of heavy-handed policies. That's true of public health measures like social distancing, and it's true of information controls as well. When this crisis does, eventually, resolve, she said, "I really think we need to have a period of reckoning about the changes that they've made and see if they're things that we want them to extend or if they demand new criticism."