Facebook has spent years resisting calls to outright forbid climate misinformation on the platform. The company has, instead, touted its Climate Science Center, which launched last year, as one antidote to the problem. But internal documents reveal that earlier this year, users surveyed by the company still largely didn't know the information center existed — and in the United States, were particularly dubious about the accuracy and trustworthiness of the information contained within it.
The documents were contained in disclosures made to the SEC and provided to Congress in redacted form by Frances Haugen's legal counsel. A consortium of news organizations, including Protocol, has reviewed the redacted versions received by Congress. Together, they reveal a company that is still seeking to understand the role it plays in climate denialism, even as its own employees have been sounding the alarm on Facebook's role in spreading climate misinformation for years.
One leaked report detailed the results of that survey, which polled users across several countries and regions about the Climate Science Center and climate change more broadly.
"The US is the only country that stands out negatively here," with regard to information accuracy, the report read. It also found that users in the U.S. were more likely than users in the other surveyed countries and regions not to believe in climate change at all.
In a statement, Facebook spokesperson Kevin McAlister said the research "underscores the reasons why we've launched our Climate Science Center and has informed our approach to connecting people with authoritative information about climate change from the world's leading climate change organizations." Facebook stressed that the research was not designed to draw a causal link and that other research has also shown a higher prevalence of climate denial in the U.S. than elsewhere in the world.
One comment on the report from an unnamed Facebook employee also suggested that users who actually visited the center were less likely to believe in climate change than people who didn't visit the center. That comment didn't draw a causal link, either, and the poster cautioned that the data was collected from U.S. and European participants only.
Facebook told Protocol that particular portion of the study was based on a small sample size of 226 people in Europe and 493 people in the U.S., and said it's possible people who know less about climate change are more likely to visit the center to get information.
"Is that an attack we are prepared for?"
The author of the report also discovered that Facebook is a primary source of climate information for its users, which presented an "opportunity to build knowledge through our platform."
Facebook has continued to make changes to the center as it's expanded to new countries, including adding quizzes and features that debunk common myths, which the survey found users also had difficulty identifying. According to data Facebook released last month, the center now has 100,000 daily visitors and 3.8 million followers. But as with most forms of misinformation, Facebook has continually declined to remove most climate misinformation from the platform, a topic that has been the source of consternation internally for years.
The documents reveal repeated instances of climate change misinformation popping up in prominent places. In one undated internal post, an employee found that a video called "Climate Change Panic is Not Based on Facts" by the conservative group Turning Point USA was the second search result for "climate change" on Facebook Watch and had amassed 6.6 million views in a little over a week.
Another August 2019 document reveals an employee asking why typing "climate change" in Facebook's search bar suggests other searches including "climate change debunked" and "climate change is a hoax."
"If someone is using Facebook Search to deliberately sow doubt and slow down the public response to the climate crisis, they are using our service to jeopardize the lives of billions of people over the coming decades. Is that an attack we are prepared for?" the employee asked.
Two months later, according to the documents, another employee posted a question asking whether the company downranks posts that deny human beings have played a role in climate change. The employee pointed to one such post "denying climate change as man made," but Protocol was not able to see the post.
"Thanks for the question," another employee replied. "We don't remove misinformation except in very narrow cases in which we have strong evidence that the content may lead to imminent offline harm against people." Another comment in that thread explains that Facebook only downranks posts if they're rated false by third-party fact-checkers, a policy that continues today.
Facebook told Protocol that according to its internal research, climate misinformation makes up a small percentage of overall climate content across its apps. The company also recently announced a $1 million grant program to fund its fact-checkers' partnerships with climate experts, and helps fact-checkers find climate misinformation by using keyword detection to group content about a similar topic together for them.
But experts say that's not good enough. "The Climate Science Center is only a small step but does not address the larger climate disinformation crisis hiding in plain sight," Charlie Cray, a senior strategist for Greenpeace USA, told Protocol. "As we've seen with the fires in Facebook's backyard, active hurricane season and water shortage in Arizona, the dangers of climate change are urgent, real and deadly. Just as Facebook has taken responsibility for its own carbon emissions, it must take responsibility to stop climate deniers from spreading disinformation on its platform."
Despite the pressure building up both internally and externally, the documents show that Facebook is continuing to scrutinize its options for addressing climate change denial on the platform, including conducting interviews with a transcontinental focus group of people who hold a range of views on the climate. The goal in part, according to the documents, was to analyze Facebook's role in shaping climate views and attitudes and understand how users experience climate misinformation and "what they think [Facebook] should do about it."
For some Facebook employees, it seems, the answer to that question is already clear.