Protocol | Policy

Facebook is finally taking organized hate seriously

Last quarter, Facebook removed more organized hate content than foreign terrorist content for the first time in its history.

Tech spent years fighting foreign terrorists. Then came the Capitol riot.
Photo: Roberto Schmidt/Getty Images

Facebook has historically cracked down on foreign terrorist threats more aggressively than domestic ones, but new data released by the company Wednesday suggests that might be changing.

Between January and March of 2021, Facebook said it took down more content related to organized hate groups than it did content related to terrorist organizations — the first time that has happened since Facebook began reporting on content violations and enforcement in late 2017.

Overall in the first quarter of 2021, Facebook removed 9.8 million pieces of organized hate content, up from 6.4 million in the last quarter of 2020. That's compared to 9 million pieces of terrorist content removed during the first quarter of this year, a slight increase from the 8.6 million pieces removed in the last quarter of 2020. On Instagram, the company continued to remove more terrorist content than organized hate content, but the overall volume of content in both categories was significantly smaller than on Facebook.

While Facebook prohibits both organized hate and terrorism, enforcement against terrorist organizations has traditionally dwarfed enforcement against domestic hate groups. That's partly because governments, including the U.S. government, have forcefully pushed social networks to banish foreign terrorist groups, like ISIS and Al Qaeda, while tiptoeing around domestic organizations such as the Oath Keepers or the Proud Boys. In the U.S., government officials have only prioritized the threat posed by those homegrown groups since the Capitol attack.

The disparity in enforcement is also partly due to the diffuse and, in some ways, disorganized nature of so-called organized hate groups. "Groups that are less organized and less structured and don't put out official propaganda in the same sort of way, you have to use a different tool kit in order to get at those kinds of entities," Brian Fishman, who leads Facebook's work fighting dangerous individuals and organizations, recently told Protocol.

In the run-up to the 2020 election, Facebook began ramping up its efforts to crack down on organized hate groups and militias, banning individuals and groups that had previously had free rein on the platform. Those policies were full of holes, and leaders of those militia groups have since been arrested for storming the Capitol on Jan. 6 as part of schemes that court records show were largely planned on Facebook-owned platforms.

Still, the policy updates, coupled with improved algorithmic detection, appear to have had an impact. "These are improvements in our technology that continue to improve how proactive we are in detecting more violating content," Guy Rosen, Facebook's vice president of integrity, said on a call with reporters.

The uptick in enforcement against organized hate groups tracks with an increase in automated enforcement against hate speech writ large. While Facebook removed slightly less hate speech in the first quarter of 2021 than it did in the last quarter of 2020, actual views of that content decreased. Facebook estimates that in the first quarter, users saw hate speech five or six times for every 10,000 views of content, down from about seven or eight views per 10,000 in the fourth quarter of 2020.

While Facebook's efforts to combat hate speech and hate groups may be progressing, its enforcement against other particularly sensitive categories of violating content suffered significant setbacks. According to Rosen, Facebook detected two technical issues that interfered with its detection of child sexual abuse material in the fourth quarter of 2020 and the first quarter of 2021. Enforcement against that type of content dropped from 12.4 million pieces of content in the third quarter of 2020 to less than half of that in each of the following two quarters.

Rosen said the company is in the "process of addressing that and going back retroactively to remove and take action" on any violations Facebook might have missed.

Facebook also said it saw a staggering increase in 2020 in the number of content restrictions governments worldwide required in order to comply with local laws. According to the report, those requests nearly doubled from 22,120 in the first half of 2020 to 42,606 in the second, "driven mainly by increases in requests from the UK, Turkey and Brazil."
