Policy

Eight takeaways from Haugen’s testimony on Facebook

Whistleblower Frances Haugen testified the company knows how many kids under 13 are on its platform and has studied how Instagram pushes extreme content — even as executives turned a blind eye to the worst posts in pursuit of growth.

Facebook whistleblower Frances Haugen alleged the company harmed young users as it pursued growth.

Photo: Jabin Botsford-Pool/Getty Images

Facebook whistleblower Frances Haugen had her day in Congress on Tuesday. In the process, she prompted many lawmakers, who have failed for years to rein in the platform, to say they were newly determined to do something about the company.

Although Haugen's information had already contributed greatly to a series of damaging reports on Facebook, her testimony on Tuesday added new details, confirmed company practices and put hard numbers behind existing concerns.

Here were the top revelations:

  • Haugen addressed social media addiction, saying Facebook studies a metric called "problematic use," which captures users who report they can't control their use even when it materially hurts their health, schoolwork or other aspects of their lives. "[5%] to 6% of 14-year-olds have the self-awareness to admit [to] both those questions," Haugen said, adding that such self-reports peak at that age. She suggested, however, that those figures underestimate the true scale of the problem.
  • The hearing focused especially on the safety of kids and teens, and many of Haugen's revelations zeroed in on the topic. She was quick to point out she didn't work specifically on those teams, and Facebook attempted to discredit her testimony on that basis. She made clear, however, that she directly drew her conclusions from Facebook's own research. For example, Facebook likes to claim that kids under 13 aren't on the platform, simply because they're not allowed to be — even as the company touts its success in removing tweens and young kids. But, Haugen said, the company can make good guesses about how many kids are on the site who shouldn't be: Research "discovers things like up to 10[%] to 15% of 10-year-olds… may be on Facebook or Instagram."
  • Outside organizations, researchers and even lawmakers who have tried to study how Facebook affects users say that Instagram pushed pro-anorexia content to test accounts purporting to be teens. As part of what's called "proactive incident response," Facebook does its own internal tests on these issues, Haugen said. "They have literally re-created that experiment themselves and confirmed, yes, this happens to people. Facebook knows that they are leading young users to anorexia content."
  • Even when Facebook has turned on artificial intelligence to curtail certain kinds of content, Haugen said, the systems have a poor track record of actually identifying posts on topics such as COVID-19 misinformation: "It's still in the raw form for 80[%], 90% of even that sensitive content."
  • Haugen alleged Facebook misled advertisers who were concerned, in the wake of the George Floyd protests last summer and the insurrection at the Capitol on Jan. 6, that their content might end up near problematic posts. "Facebook said in their talking points that they gave to advertisers, 'We're doing everything in our power to make this safer,' or, 'We take down all the hate speech when we find it,'" she said. "That was not true. They get 3[%] to 5% of hate speech."
  • The stereotype of the lonely older user who gets tricked by misinformation has some truth to it, Haugen said. "Facebook knows that the people who are exposed to the most misinformation are people who are recently widowed, divorced, moved to a new city [or] are isolated in some other way."
  • Haugen said several times that teams devoted to policing controversial content were understaffed, which she said created "implicit discouragement from having better detection systems" for the kinds of content the teams are supposed to monitor. She spoke especially about serving on a counterespionage team, saying: "At any given time, our team could only handle a third of the cases that we knew about. We knew that if we built even a basic detector, we would likely have many more cases."
  • Haugen also said she's "speaking to other parts of Congress" about issues she tracked as part of the counterespionage team, such as Chinese government surveillance of Uighurs. She added: "Facebook's very aware that this is happening on the platform."

Make it worse to make it better

In response to the parade of damaging claims, members of the committee asked Haugen to outline potential fixes for the platform's many problems. Her overall message? We might need to make Facebook a little worse to make it a little better.

Platforms, including Facebook, have long tried to make user actions as effortless as possible — they remove "friction." But some friction, Haugen suggested, may be a good thing, and perhaps we need to slow down some of the things Facebook has long tried to make faster and easier for us.

On Facebook, it's quick and easy to catch up with an old high-school friend, see your new niece or nephew before you can travel across the country, register your feelings on political news or plan who's bringing what to the neighborhood barbecue. While you're there, Facebook gets eyeballs to advertise to, which gives the company ample incentive to keep you on its platforms — and keep you coming back.

Unfortunately, Haugen contends, the algorithms that constantly keep us tuned in are also pushing us toward more extreme content. She said mainstream political parties have complained they need to post more and more aggressive positions, because the algorithms find that the content generating the angriest responses, shares and comments is the most reliable at keeping users online. Facebook's systems, she said, also prioritize the kinds of fabulous-lifestyle posts on Instagram that tend to make teen users feel unhappy by comparison. And algorithmic amplification has long played a role in making wild falsehoods go viral online.

The solution, Haugen said, includes amending Section 230 — the legal provision that shields online platforms from liability over what users post — so that companies like Facebook have to share in some legal responsibility for what their algorithms promote. She also talked about slowing down the sharing of news by prompting users to read articles before sharing them, as Twitter now does. Ideally, she said, Facebook would return to using a more chronological timeline, showing users content mostly because it's recent, not because it makes them want to leave angry comments, which in turn pushes others to respond with fury.

Haugen compared Facebook to a car, the metaphor of choice even among its defenders these days, and pointed out that state and federal regulators have fairly strict rules for automobiles, rules that rely on deep insight into and access to the actual workings of the machines we put on the road. But the metaphor has even more resonance: Haugen was, in essence, calling for the installation of rumble strips and stop signs around misinformation, while allowing people to zoom more quickly down the highway of social media when they're sharing recipes with grandma or connecting with other cancer survivors. Information affecting kids' health would travel more slowly, the same way we put literal speed restrictions in front of actual schools.

Everything from our kids' mental health to our society's ability to confront COVID-19 and work across political divisions is at stake, Haugen said, adding that the changes don't have to be the end of Facebook or its revenue. "A lot of the changes that I'm talking about are not going to make Facebook an unprofitable company," she said at one point. "It just won't be a ludicrously profitable company."

Interestingly, Haugen said she did not agree with calls to break up Facebook, arguing that its profits, particularly from advertising on Instagram, supported the much-needed research into the effects of the company's algorithm on society.

In addition to changing the incentives that Facebook uses to keep us devoted to social media — regardless of whether the content it pushes is bad for society — Haugen stressed the importance of transparency. She called for the government to establish a new regulatory body with oversight of Facebook and for more opportunities for independent researchers to figure out if the company is truly living up to its public statements to users, investors and lawmakers.

Members of the committee listened, far more respectfully than they often do in such hearings, to Haugen's prescriptions.

Democratic Sen. Richard Blumenthal, who led the hearing, suggested the U.S. Federal Trade Commission and the Securities and Exchange Commission should already be taking up the issue of any potential lies under their existing authorities.

"Facebook appears to have misled the public and investors, and if that's correct, it ought to face real penalties," he said.

And in the name of further transparency, Blumenthal urged Mark Zuckerberg to come testify, yet again, to answer Haugen's claims.


Fintech

Election markets are far from a sure bet

Kalshi has big-name backing for its plan to offer futures contracts tied to election results. Will that win over a long-skeptical regulator?

Whether Kalshi’s election contracts could be considered gaming or whether they serve a true risk-hedging purpose is one of the top questions the CFTC is weighing in its review.

Photo illustration: Getty Images; Protocol

Crypto isn’t the only emerging issue on the CFTC’s plate. The futures regulator is also weighing a fintech sector that has similarly tricky political implications: election bets.

The Commodity Futures Trading Commission has set Oct. 28 as the date by which it hopes to decide whether the New York-based startup Kalshi can let users wager up to $25,000 on which party will control the House of Representatives and the Senate after the midterms. PredictIt, another online market for election trading, has also sued the regulator over its decision to cancel a no-action letter.

Ryan Deffenbaugh
Ryan Deffenbaugh is a reporter at Protocol focused on fintech. Before joining Protocol, he reported on New York's technology industry for Crain's New York Business. He is based in New York and can be reached at rdeffenbaugh@protocol.com.
Enterprise

The Uber verdict shows why mandatory disclosure isn't such a bad idea

The conviction of Uber's former chief security officer, Joe Sullivan, seems likely to change some minds in the debate over proposed cyber incident reporting regulations.

Executives and boards will now be "a whole lot less likely to cover things up," said one information security veteran.

Photo: Al Drago/Bloomberg via Getty Images

If nothing else, the guilty verdict delivered Wednesday in a case involving Uber's former security head will have this effect on how breaches are handled in the future: Executives and boards, according to information security veteran Michael Hamilton, will be "a whole lot less likely to cover things up."

Following the conviction of former Uber chief security officer Joe Sullivan, "we likely will get better voluntary reporting" of cyber incidents, said Hamilton, formerly the chief information security officer of the City of Seattle, and currently the founder and CISO at cybersecurity vendor Critical Insight.

Kyle Alspach

Kyle Alspach (@KyleAlspach) is a senior reporter at Protocol, focused on cybersecurity. He has covered the tech industry since 2010 for outlets including VentureBeat, CRN and the Boston Globe. He lives in Portland, Oregon, and can be reached at kalspach@protocol.com.

Climate

Delta and MIT are running flight tests to fix contrails

The research team and airline are running flight tests to determine if it’s possible to avoid the climate-warming effects of contrails.

Delta and MIT just announced a partnership to test how to mitigate persistent contrails.

Photo: Gabriela Natiello/Unsplash

Contrails could be responsible for up to 2% of all global warming, and yet how they form and how to mitigate them are barely understood by major airlines.

That may be changing.

Michelle Ma

Michelle Ma (@himichellema) is a reporter at Protocol covering climate. Previously, she was a news editor of live journalism and special coverage for The Wall Street Journal. Prior to that, she worked as a staff writer at Wirecutter. She can be reached at mma@protocol.com.

Entertainment

Inside Amazon’s free video strategy

Amazon has been doubling down on original content for Freevee, its ad-supported video service, which has seen a lot of growth thanks to a deep integration with other Amazon properties.

Freevee’s investment in original programming like 'Bosch: Legacy' has increased by 70%.

Photo: Tyler Golden/Amazon Freevee

Amazon’s streaming efforts have long been all about Prime Video. So the company caught pundits by surprise when, in early 2019, it launched a stand-alone ad-supported streaming service called IMDb Freedive, with TechCrunch calling the move “a bit odd.”

Nearly four years and two rebrandings later, Amazon’s ad-supported video efforts appear to be flourishing. Viewership of the service grew by 138% from 2020 to 2021, according to Amazon. The company declined to share any updated performance data on the service, which is now called Freevee, but a spokesperson told Protocol the performance of originals in particular “exceeded expectations,” leading Amazon to increase investments in original content by 70% year-over-year.

Janko Roettgers

Janko Roettgers (@jank0) is a senior reporter at Protocol, reporting on the shifting power dynamics between tech, media, and entertainment, including the impact of new technologies. Previously, Janko was Variety's first-ever technology writer in San Francisco, where he covered big tech and emerging technologies. He has reported for Gigaom, Frankfurter Rundschau, Berliner Zeitung, and ORF, among others. He has written three books on consumer cord-cutting and online music and co-edited an anthology on internet subcultures. He lives with his family in Oakland.
