Policy

Eight takeaways from Haugen’s testimony on Facebook

Whistleblower Frances Haugen testified the company knows how many kids under 13 are on its platform and has studied how Instagram pushes extreme content — even as executives turned a blind eye to the worst posts in pursuit of growth.

Frances Haugen testifies before Congress.

Facebook whistleblower Frances Haugen alleged the company harmed young users as it pursued growth.

Photo: Jabin Botsford-Pool/Getty Images

Facebook whistleblower Frances Haugen had her day in Congress on Tuesday, and she prompted many lawmakers, who have failed for years to rein in the platform, to say they were newly determined to do something about the company.

Haugen's information had already fueled a series of damaging reports on Facebook, but her testimony on Tuesday added new details, confirmed company practices and put data behind existing concerns.

Here were the top revelations:

  • Haugen addressed social media addiction, saying Facebook studies a metric called "problematic use," which occurs when users report they can't control their use even when it materially hurts their health, schoolwork or other aspects of their lives. "[5%] to 6% of 14-year-olds have the self-awareness to admit [to] both those questions," Haugen said, adding that self-reports peak at that age. She suggested, however, that those figures underestimate the true scale of the problem.
  • The hearing focused especially on the safety of kids and teens, and many of Haugen's revelations zeroed in on the topic. She was quick to point out she didn't work specifically on those teams, and Facebook attempted to discredit her testimony on that basis. She made clear, however, that she directly drew her conclusions from Facebook's own research. For example, Facebook likes to claim that kids under 13 aren't on the platform, simply because they're not allowed to be — even as the company touts its success in removing tweens and young kids. But, Haugen said, the company can make good guesses about how many kids are on the site who shouldn't be: Research "discovers things like up to 10[%] to 15% of 10-year-olds… may be on Facebook or Instagram."
  • Outside organizations, researchers and even lawmakers who have tried to study how Facebook affects users have found that Instagram pushed pro-anorexia content to test accounts purporting to belong to teens. As part of what's called "proactive incident response," Facebook runs its own internal tests on these issues, Haugen said. "They have literally re-created that experiment themselves and confirmed, yes, this happens to people. Facebook knows that they are leading young users to anorexia content."
  • Even when Facebook has turned on artificial intelligence to curtail certain kinds of content, Haugen said, the systems have a poor track record of actually identifying posts on topics such as COVID-19 misinformation: "It's still in the raw form for 80[%], 90% of even that sensitive content."
  • Haugen alleged Facebook misled advertisers who were concerned, in the wake of the George Floyd protests last summer and the insurrection at the Capitol on Jan. 6, that their content might end up near problematic posts. "Facebook said in their talking points that they gave to advertisers, 'We're doing everything in our power to make this safer,' or, 'We take down all the hate speech when we find it,'" she said. "That was not true. They get 3[%] to 5% of hate speech."
  • The stereotype of the lonely older user who gets tricked by misinformation has some truth to it, Haugen said. "Facebook knows that the people who are exposed to the most misinformation are people who are recently widowed, divorced, moved to a new city [or] are isolated in some other way."
  • Haugen said several times that teams devoted to policing controversial content were understaffed, which she said created "implicit discouragement from having better detection systems" for the kinds of content the teams are supposed to monitor. She spoke especially about serving on a counterespionage team, saying: "At any given time, our team could only handle a third of the cases that we knew about. We knew that if we built even a basic detector, we would likely have many more cases."
  • Haugen also said she's "speaking to other parts of Congress" about issues she tracked as part of the counterespionage team, such as Chinese government surveillance of Uighurs. She added: "Facebook's very aware that this is happening on the platform."

Make it worse to make it better

In response to the parade of damaging claims, members of the committee asked Haugen to outline potential fixes for the platform's many problems. Her overall message? We might need to make Facebook a little worse to make it a little better.

Platforms, including Facebook, have long tried to make user actions as effortless as possible — they remove "friction." But some friction, Haugen suggested, may be a good thing, and perhaps we need to slow down some of the things Facebook has long tried to make faster and easier for us.

On Facebook, it's quick and easy to catch up with an old high-school friend, see your new niece or nephew before you can travel across the country, register your feelings on political news or plan who's bringing what to the neighborhood barbecue. While you're there, Facebook gets eyeballs to advertise to, which gives the company ample incentive to keep you on its platforms — and keep you coming back.

Unfortunately, Haugen contends, the algorithms that are constantly keeping us tuned in are also pushing us toward more extreme content. She said mainstream political parties have complained they need to post more and more aggressive positions, as the algorithms find content that generates the angriest responses, shares and comments is the most reliable at keeping users online. Facebook's systems, she said, also prioritize the kinds of fabulous-lifestyle posts on Instagram that tend to make teen users feel unhappy by comparison. And algorithmic amplification has long played a role in making wild falsehoods go viral online.

The solution, Haugen said, includes amending Section 230 — the legal provision that shields online platforms from liability over what users post — so that companies like Facebook have to share in some legal responsibility for what their algorithms promote. She also talked about slowing down the sharing of news by prompting users to read articles before sharing them, as Twitter now does. Ideally, she said, Facebook would return to using a more chronological timeline, showing users content mostly because it's recent, not because it makes them want to leave angry comments, which in turn pushes others to respond with fury.

Haugen compared Facebook to a car — the metaphor of choice even among its defenders these days — pointing out that state and federal regulators set strict rules for automobiles, rules that depend on deep insight into, and access to, the actual workings of the machines we put on the road. But the metaphor has even more resonance: Haugen was, in essence, calling for the installation of rumble strips and stop signs around misinformation, while allowing people to zoom more quickly down the highway of social media when they're sharing recipes with grandma or connecting with other cancer survivors. Information that threatens kids' health would travel more slowly, the same way we put literal speed limits in front of actual schools.

Everything from our kids' mental health to our society's ability to confront COVID-19 and work across political divisions is at stake, Haugen said, adding that the changes don't have to be the end of Facebook or its revenue. "A lot of the changes that I'm talking about are not going to make Facebook an unprofitable company," she said at one point. "It just won't be a ludicrously profitable company."

Interestingly, Haugen said she did not agree with calls to break up Facebook, arguing that its profits, particularly from advertising on Instagram, supported the much-needed research into the effects of the company's algorithm on society.

In addition to changing the incentives that Facebook uses to keep us devoted to social media — regardless of whether the content it pushes is bad for society — Haugen stressed the importance of transparency. She called for the government to establish a new regulatory body with oversight of Facebook and for more opportunities for independent researchers to figure out if the company is truly living up to its public statements to users, investors and lawmakers.

Members of the committee listened, far more respectfully than they often do in such hearings, to Haugen's prescriptions.

Democratic Sen. Richard Blumenthal, who led the hearing, suggested the Federal Trade Commission and the Securities and Exchange Commission should be taking up the issue of any potential lies under their existing authorities right now.

"Facebook appears to have misled the public and investors, and if that's correct, it ought to face real penalties," he said.

And in the name of further transparency, Blumenthal urged Mark Zuckerberg to come testify, yet again, to answer Haugen's claims.


Climate

2- and 3-wheelers dominate oil displacement by EVs

Increasingly widespread EV adoption is starting to displace the use of oil, but there's still a lot of work to do.

More electric mopeds on the road could be an oil demand game-changer.

Photo: Humphrey Muleba/Unsplash

Electric vehicles are starting to make a serious dent in oil use.

Last year, EVs displaced roughly 1.5 million barrels per day, according to a new analysis from BloombergNEF. That is more than double the amount EVs displaced in 2015. The majority of the displacement is coming from an unlikely source.


Enterprise

The limits of AI and automation for digital accessibility

AI and automated software promising to make the web more accessible abound, but people with disabilities and those who regularly test for digital accessibility problems say such tools can only go so far.

The everyday obstacles blocking people with disabilities from a satisfying digital experience are immense.

Image: alexsl/Getty Images

“It’s a lot to listen to a robot all day long,” said Tina Pinedo, communications director at Disability Rights Oregon, a group that works to promote and defend the rights of people with disabilities.

But listening to a machine is exactly what many people with visual impairments do while using screen reading tools to accomplish everyday online tasks such as paying bills or ordering groceries from an ecommerce site.


Fintech

The crypto crash's violence shocked Circle's CEO

Jeremy Allaire remains upbeat about stablecoins despite the UST wipeout, he told Protocol in an interview.

Allaire said what really caught him by surprise was “how fast the death spiral happened and how violent of a value destruction it was.”

Photo: Heidi Gutman/CNBC/NBCU Photo Bank/NBCUniversal via Getty Images

Circle CEO Jeremy Allaire said he saw the UST meltdown coming about six months ago, long before the stablecoin crash rocked the crypto world.

“This was a house of cards,” he told Protocol. “It was very clear that it was unsustainable and that there would be a very high risk of a death spiral.”


A DTC baby formula startup is caught in the center of a supply chain crisis

After weeks of “unprecedented growth,” Bobbie co-founder Laura Modi made a hard decision: to not accept any more new customers.

Parents unable to track down formula in stores have been turning to Facebook groups, homemade formula recipes and Bobbie, a 4-year-old subscription baby formula company.

Photo: JIM WATSON/AFP via Getty Images

The ongoing baby formula shortage has taken a toll on parents throughout the U.S. Laura Modi, co-founder of formula startup Bobbie, said she’s been “wearing the hat of a mom way more than that of a CEO” in recent weeks.

“It's scary to be a parent right now, with the uncertainty of knowing you can’t find your formula,” Modi told Protocol.

