Facebook whistleblower Frances Haugen had her day in Congress on Tuesday. In the process, she prompted many lawmakers, who have failed for years to rein in the platform, to say they were newly determined to do something about the company.
Although Haugen's disclosures had already fueled a series of damaging reports on Facebook, her testimony on Tuesday added new details, confirmed company practices and put hard numbers to existing concerns.
Here were the top revelations:
- Haugen addressed social media addiction, saying Facebook studies a metric called "problematic use," which occurs when users report they can't control their use even when it materially hurts their health, school work or other aspects of their lives. "[5%] to 6% of 14-year-olds have the self-awareness to admit [to] both those questions," Haugen said, adding that the peak of such self-reports occurred at that age. She suggested, however, that those figures underestimate the true scale of the problem.
- The hearing focused especially on the safety of kids and teens, and many of Haugen's revelations zeroed in on the topic. She was quick to point out she didn't work specifically on those teams, and Facebook attempted to discredit her testimony on that basis. She made clear, however, that she directly drew her conclusions from Facebook's own research. For example, Facebook likes to claim that kids under 13 aren't on the platform, simply because they're not allowed to be — even as the company touts its success in removing tweens and young kids. But, Haugen said, the company can make good guesses about how many kids are on the site who shouldn't be: Research "discovers things like up to 10[%] to 15% of 10-year-olds… may be on Facebook or Instagram."
- Outside organizations, researchers and even lawmakers who have tried to study how Facebook affects users have found that Instagram pushed pro-anorexia content to test accounts purporting to be teens. As part of what's called "proactive incident response," Facebook runs its own internal tests on these issues, Haugen said. "They have literally re-created that experiment themselves and confirmed, yes, this happens to people. Facebook knows that they are leading young users to anorexia content."
- Even when Facebook has turned on artificial intelligence to curtail certain kinds of content, Haugen said, the systems have a poor track record of actually identifying posts on topics such as COVID-19 misinformation: "It's still in the raw form for 80[%], 90% of even that sensitive content."
- Haugen alleged Facebook misled advertisers who were concerned, in the wake of the George Floyd protests last summer and the insurrection at the Capitol on Jan. 6, that their content might end up near problematic posts. "Facebook said in their talking points that they gave to advertisers, 'We're doing everything in our power to make this safer,' or, 'We take down all the hate speech when we find it,'" she said. "That was not true. They get 3[%] to 5% of hate speech."
- The stereotype of the lonely older user who gets tricked by misinformation has some truth to it, Haugen said. "Facebook knows that the people who are exposed to the most misinformation are people who are recently widowed, divorced, moved to a new city [or] are isolated in some other way."
- Haugen said several times that teams devoted to policing controversial content were understaffed, which she said created "implicit discouragement from having better detection systems" for the kinds of content the teams are supposed to monitor. She spoke especially about serving on a counterespionage team, saying: "At any given time, our team could only handle a third of the cases that we knew about. We knew that if we built even a basic detector, we would likely have many more cases."
- Haugen also said she's "speaking to other parts of Congress" about issues she tracked as part of the counterespionage team, such as Chinese government surveillance of Uighurs. She added: "Facebook's very aware that this is happening on the platform."
Make it worse to make it better
In response to the parade of damaging claims, members of the committee asked Haugen to outline potential fixes for the platform's many problems. Her overall message? We might need to make Facebook a little worse to make it a little better.
Platforms, including Facebook, have long tried to make user actions as effortless as possible — they remove "friction." But some friction, Haugen suggested, may be a good thing, and perhaps we need to slow down some of the things Facebook has long tried to make faster and easier for us.
On Facebook, it's quick and easy to catch up with an old high-school friend, see your new niece or nephew before you can travel across the country, register your feelings on political news or plan who's bringing what to the neighborhood barbecue. While you're there, Facebook gets eyeballs to advertise to, which gives the company ample incentive to keep you on its platforms — and keep you coming back.
Unfortunately, Haugen contends, the algorithms that are constantly keeping us tuned in are also pushing us toward more extreme content. She said mainstream political parties have complained they need to post more and more aggressive positions, as the algorithms find content that generates the angriest responses, shares and comments is the most reliable at keeping users online. Facebook's systems, she said, also prioritize the kinds of fabulous-lifestyle posts on Instagram that tend to make teen users feel unhappy by comparison. And algorithmic amplification has long played a role in making wild falsehoods go viral online.
The solution, Haugen said, includes amending Section 230 — the legal provision that shields online platforms from liability over what users post — so that companies like Facebook have to share in some legal responsibility for what their algorithms promote. She also talked about slowing down the sharing of news by prompting users to read articles before sharing them, as Twitter now does. Ideally, she said, Facebook would return to using a more chronological timeline, showing users content mostly because it's recent, not because it makes them want to leave angry comments, which in turn pushes others to respond with fury.
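The chronological-versus-engagement distinction Haugen draws can be illustrated with a toy sketch. Everything here is invented for illustration (the `Post` fields and the scoring weights are hypothetical, not Facebook's actual ranking system); it simply shows how ranking by engagement signals, rather than recency, pushes the most provocative posts to the top of a feed.

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    timestamp: int        # seconds since epoch; higher = more recent
    angry_reactions: int
    shares: int
    comments: int

def chronological_feed(posts):
    """Show the newest posts first, ignoring engagement entirely."""
    return sorted(posts, key=lambda p: p.timestamp, reverse=True)

def engagement_ranked_feed(posts):
    """Rank by a toy engagement score: heated posts rise to the top."""
    def score(p):
        # Hypothetical weights: angry reactions and shares count heavily.
        return 2 * p.angry_reactions + 3 * p.shares + p.comments
    return sorted(posts, key=score, reverse=True)

posts = [
    Post("aunt",   timestamp=300, angry_reactions=0,  shares=1,  comments=2),
    Post("pundit", timestamp=100, angry_reactions=50, shares=40, comments=90),
]

print([p.author for p in chronological_feed(posts)])      # recency wins
print([p.author for p in engagement_ranked_feed(posts)])  # outrage wins
```

The recent family post leads the chronological feed, while the older but angrier post leads the engagement-ranked one, which is the dynamic Haugen says rewards ever more aggressive content.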
Haugen compared Facebook to a car, which seems to be the chosen metaphor even of its defenders these days, pointing out that state and federal regulators have strict rules for automobiles, rules that rely on deep insight into, and access to, the actual workings of the machines we put on the road. But the metaphor has even more resonance: Haugen was, in essence, calling for the installation of rumble strips and stop signs around misinformation, while allowing people to zoom more quickly down the highway of social media when they're sharing recipes with grandma or connecting with other cancer survivors. Information around kids' health would travel more slowly, the same way we put literal speed restrictions in front of actual schools.
Everything from our kids' mental health to our society's ability to confront COVID-19 and work across political divisions is at stake, Haugen said, adding that the changes don't have to be the end of Facebook or its revenue. "A lot of the changes that I'm talking about are not going to make Facebook an unprofitable company," she said at one point. "It just won't be a ludicrously profitable company."
Interestingly, Haugen said she did not agree with calls to break up Facebook, arguing that its profits, particularly from advertising on Instagram, supported the much-needed research into the effects of the company's algorithm on society.
In addition to changing the incentives that Facebook uses to keep us devoted to social media — regardless of whether the content it pushes is bad for society — Haugen stressed the importance of transparency. She called for the government to establish a new regulatory body with oversight of Facebook and for more opportunities for independent researchers to figure out if the company is truly living up to its public statements to users, investors and lawmakers.
Members of the committee listened, far more respectfully than they often do in such hearings, to Haugen's prescriptions.
Democratic Sen. Richard Blumenthal, who led the hearing, suggested the U.S. Federal Trade Commission and Securities and Exchange Commission should be taking up the issue of any potential lies right now, under their existing authorities.
"Facebook appears to have misled the public and investors, and if that's correct, it ought to face real penalties," he said.
And in the name of further transparency, Blumenthal urged Mark Zuckerberg to come testify, yet again, to answer Haugen's claims.