
Here’s what happened when Facebook stopped protecting users — on purpose

Internal documents reveal the impact of withholding certain integrity protections from a subset of Facebook users.


Frances Haugen said Facebook withholds certain protections from a subset of users to see how they'll react.

Image: Protocol

In her testimony before Congress last month, whistleblower Frances Haugen told lawmakers Facebook has conducted experiments where it withholds certain protections from a subset of users to see how they'll react. Facebook refers to this experimental group internally, she said, as "integrity holdouts."

"These are people who don't get protections from integrity systems to see what happens to them," Haugen said. "And those people who deal with a more toxic, painful version of Facebook use Facebook less."

Internal documents reveal a more complex story. According to one internal report from April 2019, Facebook has studied the impact of removing some protections against problematic content like clickbait and untrustworthy news for some users, but the results, at least in that report, were decidedly mixed.

The report showed that during one March 2019 test, when the company rolled back some protections from millions of users' News Feeds, their exposure to some of the worst forms of harmful content, like graphic violence, barely changed. As the report's author wrote, referring to the company's News Feed protections at the time, "We are likely having little (if any) impact on violence."

The report also suggested that far from using Facebook less, integrity holdouts actually commented more and had more sessions on the app. "Given that Integrity is changing a ranking algorithm that is optimized for engagement, it is not surprising that integrity has some negative engagement impact," the report read.

A spokesperson for Haugen said that other documents she collected showed that retention is stronger among regular users than holdouts, but those findings were not included in this report.

The report was included in disclosures made to the Securities and Exchange Commission and provided to Congress in redacted form by Haugen's legal counsel. A consortium of news organizations, including Protocol, has reviewed the redacted versions received by Congress. They offer a glimpse into how Facebook has analyzed the efficacy of its user protections — and weighed them against their impact on other company priorities like growth.

"Testing product and safety features is an important part of improving our platform and something that's standard in tech and many other industries," Facebook spokesperson Drew Pusateri told Protocol, noting that the holdout affected about 2% of Facebook users. "It helps us build the tools to reduce the prevalence of hate speech and other types of problematic content on our platform."

'High-harm spaces'

Facebook conducts holdout experiments for a range of business goals, not just its integrity work. Holdouts are effectively control groups that Facebook can compare to its larger pool of users. As former Facebook data scientist Sophie Zhang recently told Protocol, Facebook has also studied the impact of withholding ads from users. "The company wants to know the very long-term impacts of advertising on retention and usage for Facebook," Zhang said. "The argument was usually that we need to know what the impact of this is. We need to know if people like it or not. But this is also motivated by wanting to know the impact for growth."

By early 2019, it appears, the company had begun applying this approach to integrity protections. The report, published in late April 2019, detailed the initial findings from an experiment that tinkered with certain integrity protections in News Feed. Some of the findings were encouraging: The report showed, for instance, that regular users got substantially less clickbait and ad-farm content than the holdouts did, something the author notes is "not surprising," given that the company was demoting clickbait and ad farms "quite a bit."

The report showed that regular users' exposure to what the company considers "low quality news" was down about 18% compared to the holdouts. The company also found it was boosting content from major publishers more when integrity protections were in place. Holdouts, by contrast, were more likely to click the dislike button and report the posts they saw, and they were also more likely to see content from public pages than regular users were.

But the main takeaway from the report, the author wrote, was that the News Feed protections that were in place at the time weren't having an equally significant effect on more severe types of harm, like graphic violence. "I believe strongly that this needs to change," the author wrote.

During the experiment, the company continued to demote hate speech and graphic violence in at-risk countries, the report said. But for holdouts who weren't in at-risk countries, those demotions didn't exist. And yet, the report found no impact on regular users' exposure to violence compared to the holdouts.

"11% of users see content that has been marked as disturbing every day; 16% of users see content that is likely to be bullying; 39% of users see hateful content (i.e. borderline hate); 32% of users see borderline 3+ nudity content," the author wrote. "These are significant proportions of [daily active users] and we have effectively no ranking interventions in place to mitigate this." The author added, however, that those particular numbers "should be taken with a grain of salt," as measuring bad experiences on the platform was still a work in progress.

The report also made no secret of the negative impact of News Feed integrity protections on engagement. "By definition, Integrity is going to cause some engagement regression," the author wrote, noting that there are "tradeoffs between Integrity and Engagement."

Integrity efforts, the report found, were a blow to the company's "meaningful social interactions" metric, which emphasizes interactions between friends over public-facing content. One reason for that, the author proposed, was that holdouts commented more than regular users did. While regular users were more likely to like posts on Facebook compared to holdouts, the author wrote, it was "not enough to make up for the decline in comments." The report also showed that content views and time spent on the app were down slightly among regular users compared to holdouts.

The report's limitations

It would be easy to construe the findings from this report as a total contradiction of Haugen's claims and a condemnation of integrity work's impact on the worst types of content in general. But that would be a misread, said Sahar Massachi, a former member of Facebook's integrity team and co-founder of the new Integrity Institute think tank. It's important to note, he said, that this document appears to look only at the integrity protections that existed in News Feed rankings at the time, and doesn't account for interventions that other teams at Facebook might have been working on.

It also only looks at the integrity interventions that the News Feed team had already deployed, not the full range of possible interventions that may have been proposed but were shot down. "Their view on what 'integrity' covers is likely scoped to whichever team they're on," Massachi said of the report's author. "I read this as: Integrity interventions that were allowed to ship — in the scoped set that this person considered — didn't affect views of that kind of content."

The report itself isn't clear on exactly what protections were being withheld from the holdouts, but a comment posted along with the document suggests that the experiment affected protections related to clickbait, ad farms, engagement bait and news trustworthiness, among other things. Given that fact, it shouldn't be all that surprising that exposure to graphic violence wasn't impacted by the experiment.

But what the report does call attention to is the fact that, at the time at least, Facebook's integrity protections for News Feed weren't designed to capture more severe harms. The company had only begun demoting what it called "borderline" content that nearly violated its policies a few months before the report was published, and the rollout of those demotions was slow.

"This document says: We should expand the definitions more," said one current Facebook employee who has worked on News Feed ranking and reviewed the report. And according to that employee, the message stuck. "This person's argument was successful in that the program was expanded in various dimensions."

The employee said, however, that some of those expansions were rolled back before Facebook published a public list of content it demotes.

"The story of integrity is you try to do the good thing and you go to the execs, and they shoot you down, and you come back with something more conservative, and you realize you didn't do anything, so you try again," the employee said. "What you're seeing [in this document] is that middle part."

Facebook wouldn't comment on whether the company changed its demotions before publishing its list, but Pusateri said the demotions included on that list are still in place today.

Both Zhang and Massachi — as well as Facebook's own public relations team — cautioned Protocol not to cast Facebook's decision to withhold these protections at all as a scandal in and of itself. Measuring the effectiveness of these interventions, they said, is critical to strengthening them. As Massachi put it: "In the vaccine trials, some people have to get the placebo."


