Issie Lapowsky

Power

How COVID-19 helped — and hurt — Facebook’s fight against bad content

The amount of child sexual abuse material Instagram caught and removed fell dramatically, while hate speech removals on Facebook and Instagram grew.


Facebook's data shows that the pandemic made content-moderation systems both better and worse.

Photo: Drew Angerer/Getty Images

When Facebook sent its content moderators home in March due to the COVID-19 pandemic, announcing that automation would do their jobs at least temporarily, the company predicted the decision would have a major impact on its ability to find and remove content that violates its policies. Now, according to newly released data from Facebook, we know just how big that impact was.

During the second quarter of 2020, the company removed less than half as much child sexual abuse material from Instagram as it did the quarter before — not because there was less of it, but because the company was less equipped to catch it. And on both Facebook and Instagram, the amount of suicide and self-injury content removed dropped precipitously, too. On Instagram, it fell from 1.3 million pieces of such content removed in the first quarter to just 275,000 in the second.

But in other categories, like hate speech, Facebook's heavier reliance on automated systems actually led to a drastic increase in removals: from 9.6 million pieces of hate speech removed from Facebook in the first quarter of 2020 to 22.5 million removed between April and June.

The drop in removals of child sexual abuse material from Instagram wasn't due to a decrease in the amount of it on the platform. Neither was the decrease in takedowns of suicide-related content. Both were due to the limited number of human beings available to look at those posts, since, initially at least, they were all working from home. "The reason this content is challenging is because it's graphic content that, at home, is very hard for people to moderate," said Guy Rosen, Facebook's vice president of integrity. "We want to be very careful with the environment that people have in order to look at that content."

It's not that human reviewers are required to spot all child sexual abuse material themselves. Automated systems are already responsible for removing 97.5% of those posts that appear on Facebook. But according to Facebook spokesperson Emily Cain, human reviewers are critical when it comes to "banking" child sexual abuse material: taking known images and logging them so that Facebook's AI systems can then find and remove them.

"Without humans banking this content then our machines can't find it at scale," Cain said. "And this compounds after a while, so our content-actioned numbers decreased."

"Overall, this pandemic and this situation really reinforced to us that it is always people and technology working together," Rosen said on a call with reporters Tuesday. "We always need people who look and measure and help tune our automation to ensure that we're always up to speed and always up to date with how content is evolving."

The decrease in content removal is a blow to Facebook's ongoing efforts to fight the spread of child sexual abuse material on the platform at a time when the National Center for Missing and Exploited Children says it is seeing an exponential increase in reports of child exploitation. That said, the company did manage to remove more pieces of child sexual abuse material from the Facebook app than it did the quarter before. Still, removals in that category in 2020 are down significantly from where they were at the end of 2019.

Rosen said that during the COVID-19 crisis, Facebook developed a ranking system to prioritize the most critical content in these sensitive categories, which might include anything from a live video to a post in which someone indicates they plan to harm themselves imminently. The ranking system was already in the works before COVID-19, but Rosen said the company expedited its development in response to the crisis.

"This enables our teams to spend their time on the cases where we need their expertise the most, and it means there will be a shift towards more content being initially actioned by our automated systems," Rosen said.

As for the sharp increase in the amount of hate speech being removed from the platform, Rosen attributed that, too, to the ongoing development of Facebook's AI systems. Because hate speech is less graphic than, say, a video of child abuse, moderators are better able to handle that content remotely. As Facebook's chief technology officer Mike Schroepfer told Protocol in a tweet, "The more … sensitive and nuanced the content the more we need help from people."

Of course, the perennial question about hate speech, child sexual abuse material and other types of problematic content is not just how much Facebook is taking down and how fast, but how prevalent that content is to begin with. On the subject of hate speech, that's a question that Facebook hasn't been able to answer yet. Turns out, measuring prevalence requires a lot of human input, too.
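To see why, consider how prevalence is typically estimated: draw a random sample of content views, have trained reviewers label each one, and extrapolate. Here is a minimal sketch of that arithmetic, with illustrative numbers rather than anything from Facebook's reports.

```python
import math

def prevalence_estimate(violating: int, sample_size: int) -> tuple[float, float]:
    """Point estimate and 95% normal-approximation margin of error."""
    p = violating / sample_size
    margin = 1.96 * math.sqrt(p * (1 - p) / sample_size)
    return p, margin

# Illustrative numbers only: 27 violating items found in 10,000 human-labeled views.
p, margin = prevalence_estimate(violating=27, sample_size=10_000)
print(f"Estimated prevalence: {p:.2%} ± {margin:.2%}")
```

Tightening the error bars on a rare category means labeling many more samples — exactly the kind of human input the pandemic made scarce.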
