Policy

Eight takeaways from Haugen’s testimony on Facebook

Whistleblower Frances Haugen testified the company knows how many kids under 13 are on its platform and has studied how Instagram pushes extreme content — even as executives turned a blind eye to the worst posts in pursuit of growth.

Frances Haugen testifies before Congress.

Facebook whistleblower Frances Haugen alleged the company harmed young users as it pursued growth.

Photo: Jabin Botsford-Pool/Getty Images

Facebook whistleblower Frances Haugen had her day in Congress on Tuesday. In the process, she prompted many lawmakers, who have failed for years to rein in the platform, to say they were newly determined to do something about the company.

Although Haugen's information had already contributed greatly to a series of damaging reports on Facebook, her testimony on Tuesday added new details, further confirmed company practices and put hard numbers on existing concerns.

Here were the top revelations:

  • Haugen addressed social media addiction, saying Facebook studies a metric called "problematic use," which occurs when users report they can't control their use even when it materially hurts their health, school work or other aspects of their lives. "5[%] to 6% of 14-year-olds have the self-awareness to admit [to] both those questions," Haugen said, adding that the peak of such self-reports occurred at that age. She suggested, however, that those figures underestimate the true scale of the problem.
  • The hearing focused especially on the safety of kids and teens, and many of Haugen's revelations zeroed in on the topic. She was quick to point out she didn't work specifically on those teams, and Facebook attempted to discredit her testimony on that basis. She made clear, however, that she directly drew her conclusions from Facebook's own research. For example, Facebook likes to claim that kids under 13 aren't on the platform, simply because they're not allowed to be — even as the company touts its success in removing tweens and young kids. But, Haugen said, the company can make good guesses about how many kids are on the site who shouldn't be: Research "discovers things like up to 10[%] to 15% of 10-year-olds… may be on Facebook or Instagram."
  • Outside organizations, researchers and even lawmakers who have tried to study how Facebook affects users say that Instagram pushed pro-anorexia content to test accounts purporting to be teens. As part of what's called "proactive incident response," Facebook does its own internal tests on these issues, Haugen said. "They have literally re-created that experiment themselves and confirmed, yes, this happens to people. Facebook knows that they are leading young users to anorexia content."
  • Even when Facebook has turned on artificial intelligence to curtail certain kinds of content, Haugen said, the systems have a poor track record of actually identifying posts on topics such as COVID-19 misinformation: "It's still in the raw form for 80[%], 90% of even that sensitive content."
  • Haugen alleged Facebook misled advertisers who were concerned, in the wake of the George Floyd protests last summer and the insurrection at the Capitol on Jan. 6, that their content might end up near problematic posts. "Facebook said in their talking points that they gave to advertisers, 'We're doing everything in our power to make this safer,' or, 'We take down all the hate speech when we find it,'" she said. "That was not true. They get 3[%] to 5% of hate speech."
  • The stereotype of the lonely older user who gets tricked by misinformation has some truth to it, Haugen said. "Facebook knows that the people who are exposed to the most misinformation are people who are recently widowed, divorced, moved to a new city [or] are isolated in some other way."
  • Haugen said several times that teams devoted to policing controversial content were understaffed, which she said created "implicit discouragement from having better detection systems" for the kinds of content the teams are supposed to monitor. She spoke especially about serving on a counterespionage team, saying: "At any given time, our team could only handle a third of the cases that we knew about. We knew that if we built even a basic detector, we would likely have many more cases."
  • Haugen also said she's "speaking to other parts of Congress" about issues she tracked as part of the counterespionage team, such as Chinese government surveillance of Uighurs. She added: "Facebook's very aware that this is happening on the platform."

Make it worse to make it better

In response to the parade of damaging claims, members of the committee asked Haugen to outline potential fixes for the platform's many problems. Her overall message? We might need to make Facebook a little worse to make it a little better.

Platforms, including Facebook, have long tried to make user actions as effortless as possible — they remove "friction." But some friction, Haugen suggested, may be a good thing, and perhaps we need to slow down some of the things Facebook has long tried to make faster and easier for us.

On Facebook, it's quick and easy to catch up with an old high-school friend, see your new niece or nephew before you can travel across the country, register your feelings on political news or plan who's bringing what to the neighborhood barbecue. While you're there, Facebook gets eyeballs to advertise to, which gives the company ample incentive to keep you on its platforms — and keep you coming back.

Unfortunately, Haugen contends, the algorithms that are constantly keeping us tuned in are also pushing us toward more extreme content. She said mainstream political parties have complained they need to post more and more aggressive positions, as the algorithms find content that generates the angriest responses, shares and comments is the most reliable at keeping users online. Facebook's systems, she said, also prioritize the kinds of fabulous-lifestyle posts on Instagram that tend to make teen users feel unhappy by comparison. And algorithmic amplification has long played a role in making wild falsehoods go viral online.

The solution, Haugen said, includes amending Section 230 — the legal provision that shields online platforms from liability over what users post — so that companies like Facebook have to share in some legal responsibility for what their algorithms promote. She also talked about slowing down the sharing of news by prompting users to read articles before sharing them, as Twitter now does. Ideally, she said, Facebook would return to using a more chronological timeline, showing users content mostly because it's recent, not because it makes them want to leave angry comments, which in turn pushes others to respond with fury.

Haugen compared Facebook to a car — which seems to be the chosen metaphor even of its defenders these days — pointing out that state and federal regulators have fairly strict rules for automobiles, rules that rely on extensive insight into, and access to, the actual workings of the machines we put on the road. But the metaphor has even more resonance: Haugen was, in essence, calling for the installation of rumble strips and stop signs around misinformation, while allowing people to zoom more quickly down the highway of social media when they're sharing recipes with grandma or connecting with other cancer survivors. Information affecting kids' health would travel more slowly, the same way we put literal speed restrictions in front of actual schools.

Everything from our kids' mental health to our society's ability to confront COVID-19 and work across political divisions is at stake, Haugen said, adding that the changes don't have to be the end of Facebook or its revenue. "A lot of the changes that I'm talking about are not going to make Facebook an unprofitable company," she said at one point. "It just won't be a ludicrously profitable company."

Interestingly, Haugen said she did not agree with calls to break up Facebook, arguing that its profits, particularly from advertising on Instagram, supported the much-needed research into the effects of the company's algorithm on society.

In addition to changing the incentives that Facebook uses to keep us devoted to social media — regardless of whether the content it pushes is bad for society — Haugen stressed the importance of transparency. She called for the government to establish a new regulatory body with oversight of Facebook and for more opportunities for independent researchers to figure out if the company is truly living up to its public statements to users, investors and lawmakers.

Members of the committee listened, far more respectfully than they often do in such hearings, to Haugen's prescriptions.

Democratic Sen. Richard Blumenthal, who led the hearing, suggested the U.S. Federal Trade Commission and Securities and Exchange Commission should already be taking up the issue of any potential lies under their existing authorities.

"Facebook appears to have misled the public and investors, and if that's correct, it ought to face real penalties," he said.

And in the name of further transparency, Blumenthal urged Mark Zuckerberg to come testify, yet again, to answer Haugen's claims.
