Policy

Here’s what happened when Facebook stopped protecting users — on purpose

Internal documents reveal the impact of withholding certain integrity protections from a subset of Facebook users.

[Illustration: the Facebook logo cracking. Caption: Frances Haugen said Facebook withholds certain protections from a subset of users to see how they'll react. Image: Protocol]

In her testimony before Congress last month, whistleblower Frances Haugen told lawmakers Facebook has conducted experiments where it withholds certain protections from a subset of users to see how they'll react. Facebook refers to this experimental group internally, she said, as "integrity holdouts."

"These are people who don't get protections from integrity systems to see what happens to them," Haugen said. "And those people who deal with a more toxic, painful version of Facebook use Facebook less."

Internal documents reveal a more complex story. According to one internal report from April 2019, Facebook has studied the impact of removing some protections against problematic content like clickbait and untrustworthy news for some users, but the results, at least in that report, were decidedly mixed.

The report showed that during one March 2019 test, when the company rolled back some protections from millions of users' News Feeds, their exposure to some of the worst forms of harmful content, like graphic violence, barely changed. As the report's author wrote, referring to the company's News Feed protections at the time, "We are likely having little (if any) impact on violence."

The report also suggested that far from using Facebook less, integrity holdouts actually commented more and had more sessions on the app. "Given that Integrity is changing a ranking algorithm that is optimized for engagement, it is not surprising that integrity has some negative engagement impact," the report read.

A spokesperson for Haugen said that other documents she collected showed that retention is stronger among regular users than holdouts, but those findings were not included in this report.

The report was included in disclosures made to the Securities and Exchange Commission and provided to Congress in redacted form by Frances Haugen's legal counsel. A consortium of news organizations, including Protocol, has reviewed the redacted versions received by Congress. They offer a glimpse at how Facebook has analyzed the efficacy of its user protections — and weighed them against their impact on other company priorities like growth.

"Testing product and safety features is an important part of improving our platform and something that's standard in tech and many other industries," Facebook spokesperson Drew Pusateri told Protocol, noting that the holdout affected about 2% of Facebook users. "It helps us build the tools to reduce the prevalence of hate speech and other types of problematic content on our platform."

'High-harm spaces'

Facebook conducts holdout experiments for a range of business goals, not just its integrity work. Holdouts are effectively control groups that Facebook can compare to its larger pool of users. As former Facebook data scientist Sophie Zhang recently told Protocol, Facebook has also studied the impact of withholding ads from users. "The company wants to know the very long-term impacts of advertising on retention and usage for Facebook," Zhang said. "The argument was usually that we need to know what the impact of this is. We need to know if people like it or not. But this is also motivated by wanting to know the impact for growth."
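
Facebook hasn't published how its experimentation tooling assigns users, but mechanically a long-running holdout is just a deterministic control group: hash each user into a stable bucket and exempt a small slice from the treatment. Here is a minimal sketch of that idea in Python; the function names and bucketing scheme are assumptions, and only the roughly 2% holdout size comes from Facebook's own description of the experiment:

```python
import hashlib

# Hypothetical sketch, not Facebook's actual tooling. The ~2% figure comes
# from Facebook's statement that the holdout affected about 2% of users.
HOLDOUT_FRACTION = 0.02
NUM_BUCKETS = 10_000

def bucket(user_id: str, experiment: str) -> int:
    """Deterministically map a user to one of NUM_BUCKETS buckets.

    Hashing the user ID together with the experiment name gives each
    experiment an independent split, and the same user always lands in
    the same bucket, so the control group stays stable over time.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return int(digest, 16) % NUM_BUCKETS

def in_integrity_holdout(user_id: str) -> bool:
    """True if this user is withheld from integrity protections."""
    return bucket(user_id, "integrity_holdout_2019") < HOLDOUT_FRACTION * NUM_BUCKETS
```

Because the assignment never changes, holdout users see the unprotected product consistently over months, which is what makes long-run comparisons of retention and engagement between the two groups meaningful.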

By early 2019, it appears, the company had begun applying this approach to integrity protections. The report, published in late April 2019, detailed the initial findings from an experiment that tinkered with certain integrity protections in News Feed. Some of the findings were encouraging: The report showed, for instance, that regular users got substantially less clickbait and ad-farm content than the holdouts did, something the author notes is "not surprising," given that the company was demoting clickbait and ad farms "quite a bit."
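
The report doesn't include ranking code, but a "demotion" in a feed ranker typically means scaling down a post's score when a classifier flags it. The sketch below illustrates only that concept; the labels, threshold and multipliers are invented for illustration, not values from the documents:

```python
# Hypothetical demotion step in an engagement-optimized feed ranker.
# Labels, threshold and multipliers are illustrative, not Facebook's values.
DEMOTION_MULTIPLIERS = {
    "clickbait": 0.3,
    "ad_farm": 0.3,
    "low_quality_news": 0.7,
}

def adjusted_score(base_score: float,
                   classifier_scores: dict[str, float],
                   in_holdout: bool,
                   threshold: float = 0.8) -> float:
    """Apply integrity demotions to a post's ranking score.

    Holdout users skip this step entirely, which is what makes their
    feeds the unprotected baseline the report compares against.
    """
    if in_holdout:
        return base_score
    score = base_score
    for label, multiplier in DEMOTION_MULTIPLIERS.items():
        if classifier_scores.get(label, 0.0) >= threshold:
            score *= multiplier
    return score
```

Under a scheme like this, removing the demotion step only changes outcomes for content some classifier actually targets; posts no classifier covers rank identically for both groups, which foreshadows the report's finding on graphic violence.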

The report showed that regular users' exposure to what the company considers "low quality news" was down about 18% compared to the holdouts. The company also found it was boosting content from major publishers more when integrity protections were in place. Holdouts, by contrast, were more likely to click the dislike button and report the posts they saw, and they were also more likely to see content from public pages than regular users were.

But the main takeaway from the report, the author wrote, was that the News Feed protections that were in place at the time weren't having an equally significant effect on more severe types of harm, like graphic violence. "I believe strongly that this needs to change," the author wrote.

During the experiment, the company continued to demote hate speech and graphic violence in at-risk countries, the report said. But for holdouts who weren't in at-risk countries, those demotions didn't exist. And yet, the report found no impact on regular users' exposure to violence compared to the holdouts.

"11% of users see content that has been marked as disturbing every day; 16% of users see content that is likely to be bullying; 39% of users see hateful content (i.e. borderline hate); 32% of users see borderline 3+ nudity content," the author wrote. "These are significant proportions of [daily active users] and we have effectively no ranking interventions in place to mitigate this." The author added, however, that those particular numbers "should be taken with a grain of salt," as measuring bad experiences on the platform was still a work in progress.

The report also made no secret of the negative impact of News Feed integrity protections on engagement. "By definition, Integrity is going to cause some engagement regression," the author wrote, noting that there are "tradeoffs between Integrity and Engagement."

Integrity efforts, the report found, were a blow to the company's "meaningful social interactions" metric, which emphasizes interactions between friends over public-facing content. One reason for that, the author proposed, was that holdouts commented more than regular users did. While regular users were more likely to like posts on Facebook compared to holdouts, the author wrote, it was "not enough to make up for the decline in comments." The report also showed that content views and time spent on the app were down slightly among regular users compared to holdouts.
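
Reporting on the Facebook Papers has described MSI as a weighted score in which comments and reshares count for more than likes. If that's right, the arithmetic behind the finding is simple, as in this toy example with invented weights and counts:

```python
# Toy MSI-style score. Weights and interaction counts are invented; the
# point is only that comment-heavy weighting makes lost comments hard to
# offset with extra likes.
MSI_WEIGHTS = {"like": 1.0, "comment": 4.0, "reshare": 5.0}

def msi(interactions: dict[str, int]) -> float:
    return sum(MSI_WEIGHTS[kind] * count for kind, count in interactions.items())

regular_user = msi({"like": 120, "comment": 20, "reshare": 5})  # 225.0
holdout_user = msi({"like": 100, "comment": 30, "reshare": 5})  # 245.0
# The regular user's 20 extra likes add 20 points, but 10 fewer comments
# cost 40, so the metric drops even though likes went up.
```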

The report's limitations

It would be easy to construe the findings from this report as a total contradiction of Haugen's claims and a condemnation of integrity work's impact on the worst types of content in general. But that would be a misread, said Sahar Massachi, a former member of Facebook's integrity team and co-founder of the new Integrity Institute think tank. It's important to note, he said, that this document appears to look only at the integrity protections that existed in News Feed ranking at the time, and doesn't take into account other integrity interventions that other teams at Facebook might have been working on.

It also only looks at the integrity interventions that the News Feed team had already deployed, not the full range of possible interventions that may have been proposed but were shot down. "Their view on what 'integrity' covers is likely scoped to whichever team they're on," Massachi said of the report's author. "I read this as: Integrity interventions that were allowed to ship — in the scoped set that this person considered — didn't affect views of that kind of content."

The report itself isn't clear on exactly what protections were being withheld from the holdouts, but a comment posted along with the document suggests that the experiment affected protections related to clickbait, ad farms, engagement bait and news trustworthiness, among other things. Given that fact, it shouldn't be all that surprising that exposure to graphic violence wasn't impacted by the experiment.

But the report calls attention to the fact that, at the time at least, Facebook's integrity protections for News Feed weren't designed to capture more severe harms. The company had only begun demoting what it called "borderline" content that nearly violated its policies a few months before the report was published, and the rollout of those demotions was slow.

"This document says: We should expand the definitions more," said one current Facebook employee who has worked on News Feed ranking and reviewed the report. And according to that employee, the message stuck. "This person's argument was successful in that the program was expanded in various dimensions."

The employee said, however, that some of those expansions were rolled back before Facebook published a public list of content it demotes.

"The story of integrity is you try to do the good thing and you go to the execs, and they shoot you down, and you come back with something more conservative, and you realize you didn't do anything, so you try again," the employee said. "What you're seeing [in this document] is that middle part."

Facebook wouldn't comment on whether the company changed its demotions before publishing its list, but Pusateri said the demotions included on that list are still in place today.

Both Zhang and Massachi — as well as Facebook's own public relations team — cautioned Protocol not to cast Facebook's decision to withhold these protections at all as a scandal in and of itself. Measuring the effectiveness of these interventions, they said, is critical to strengthening them. As Massachi put it: "In the vaccine trials, some people have to get the placebo."


