Protocol | Policy

Facebook is finally taking organized hate seriously

Last quarter, Facebook removed more organized hate content than foreign terrorist content for the first time in its history.

Tech spent years fighting foreign terrorists. Then came the Capitol riot.
Photo: Roberto Schmidt/Getty Images

Facebook has historically cracked down on foreign terrorist threats more aggressively than domestic ones, but new data released by the company Wednesday suggests that might be changing.

Between January and March of 2021, Facebook said it took down more content related to organized hate groups than it did content related to terrorist organizations — the first time that has happened since Facebook began reporting on content violations and enforcement in late 2017.

Overall, in the first quarter of 2021, Facebook removed 9.8 million pieces of organized hate content, up from 6.4 million in the last quarter of 2020. That's compared to 9 million pieces of terrorist content removed during the first quarter of this year, a slight increase from 8.6 million pieces in the last quarter of 2020. On Instagram, the company continued to remove more terrorist content than organized hate content, but the overall volume of content in both categories was significantly smaller than it was on Facebook.

While Facebook prohibits both organized hate and terrorism, enforcement against terrorist organizations has traditionally dwarfed enforcement against domestic hate groups. That's partly because governments, including the U.S. government, have forcefully pushed social networks to banish foreign terrorist groups, like ISIS and Al Qaeda, while tiptoeing around domestic organizations such as the Oath Keepers or the Proud Boys. In the U.S., the threat posed by those homegrown groups has only been prioritized by government officials since the Capitol attack.

The disparity in enforcement is also partly due to the diffuse and, in some ways, disorganized nature of so-called organized hate groups. "Groups that are less organized and less structured and don't put out official propaganda in the same sort of way, you have to use a different tool kit in order to get at those kinds of entities," Brian Fishman, who leads Facebook's work fighting dangerous individuals and organizations, recently told Protocol.

In the run-up to the 2020 election, Facebook began ramping up its efforts to crack down on organized hate groups and militias, banning individuals and groups that had previously had free rein on the platform. Those policies were full of holes, and leaders of those militia groups have since been arrested for storming the Capitol on Jan. 6 as part of schemes that court records show were largely planned on Facebook-owned platforms.

Still, the policy updates, coupled with improved algorithmic detection, appear to have had an impact. "These are improvements in our technology that continue to improve how proactive we are in detecting more violating content," Guy Rosen, Facebook's vice president of integrity, said on a call with reporters.

The uptick in enforcement against organized hate groups tracks with an increase in automated enforcement against hate speech writ large. While Facebook removed slightly less hate speech in the first quarter of 2021 than it did in the last quarter of 2020, actual views of that content decreased. Facebook estimates that in the first quarter, users saw hate speech five or six times for every 10,000 views of content. That's down from about seven or eight views of hate speech for every 10,000 views in the fourth quarter of 2020.

While Facebook's efforts to combat hate speech and hate groups may be progressing, its enforcement against other particularly sensitive categories of violating content suffered significant setbacks. According to Rosen, Facebook detected two technical issues that interfered with its detection of child sexual abuse material in the fourth quarter of 2020 and the first quarter of 2021. Enforcement against that type of content dropped from 12.4 million pieces of content in the third quarter of 2020 to less than half of that in the fourth quarter of 2020 and first quarter of 2021.

Rosen said the company is in the "process of addressing that and going back retroactively to remove and take action" on any violations Facebook might have missed.

Facebook also said it saw a staggering increase in 2020 in the number of content restrictions required by governments worldwide to comply with local laws. According to the report, those requests nearly doubled from 22,120 in the first half of 2020 to 42,606 in the second half, "driven mainly by increases in requests from the UK, Turkey and Brazil."

Power

How the creators of Splitgate built gaming’s newest unicorn

1047 Games is now valued at $1.5 billion after three rounds of funding since May.

1047 Games' Splitgate amassed 13 million downloads when its beta launched in July.

Image: 1047 Games

The creators of Splitgate had a problem. Their new free-to-play video game, a take on the legendary arena shooter Halo with a teleportation twist borrowed from Valve's Portal, was gaining steam during its open beta period in July. But it was happening too quickly.

Splitgate was growing so fast and so unexpectedly that the entire game was starting to break, as the servers supporting it began to, figuratively speaking, melt down. The game went from fewer than 1,000 people playing it at any given moment to suddenly having tens of thousands of concurrent players. Then it grew to hundreds of thousands of players, all trying to log in and play at once across PlayStation, Xbox and PC.

Nick Statt
Nick Statt is Protocol's video game reporter. Prior to joining Protocol, he was news editor at The Verge covering the gaming industry, mobile apps and antitrust out of San Francisco, in addition to managing coverage of Silicon Valley tech giants and startups. He now resides in Rochester, New York, home of the garbage plate and, completely coincidentally, the World Video Game Hall of Fame. He can be reached at nstatt@protocol.com.

While it's easy to get lost in the operational and technical side of a transaction, it's important to remember the third component of a payment. That is, the human behind the screen.

Over the last two years, many retailers have seen the benefit of investing in new, flexible payment options that reflect the changing lifestyles of younger spenders, who are increasingly holding onto their cash despite reports to the contrary. This means it's more important than ever for merchants to take note of the latest payment innovations so they can tap into the savings of the COVID-19 generation.

Antoine Nougue, Checkout.com

Antoine Nougue is Head of Europe at Checkout.com. He works with ambitious enterprise businesses to help them scale and grow their operations through payment processing services. He is responsible for leading the European sales, customer success, engineering & implementation teams and is based out of London, U.K.

Protocol | Policy

Why Twitch’s 'hate raid' lawsuit isn’t just about Twitch

When is it OK for tech companies to unmask their anonymous users? And when should a violation of terms of service get someone sued?

The case Twitch is bringing against two hate raiders is hardly black and white.

Photo: Caspar Camille Rubin/Unsplash

It isn't hard to figure out who the bad guys are in Twitch's latest lawsuit against two of its users. On one side are two anonymous "hate raiders" who have been allegedly bombarding the gaming platform with abhorrent attacks on Black and LGBTQ+ users, using armies of bots to do it. On the other side is Twitch, a company that, for all the lumps it's taken for ignoring harassment on its platform, is finally standing up to protect its users against persistent violators whom it's been unable to stop any other way.

But the case Twitch is bringing against these hate raiders is hardly black and white. For starters, the plaintiff here isn't an aggrieved user suing another user for defamation on the platform. The plaintiff is the platform itself. Complicating matters more is the fact that, according to a spokesperson, at least part of Twitch's goal in the case is to "shed light on the identity of the individuals behind these attacks," raising complicated questions about when tech companies should be able to use the courts to unmask their own anonymous users and, just as critically, when they should be able to actually sue them for violating their speech policies.

Issie Lapowsky

Issie Lapowsky (@issielapowsky) is Protocol's chief correspondent, covering the intersection of technology, politics, and national affairs. She also oversees Protocol's fellowship program. Previously, she was a senior writer at Wired, where she covered the 2016 election and the Facebook beat in its aftermath. Prior to that, Issie worked as a staff writer for Inc. magazine, writing about small business and entrepreneurship. She has also worked as an on-air contributor for CBS News and taught a graduate-level course at New York University's Center for Publishing on how tech giants have affected publishing.

Protocol | Workplace

Remote work is here to stay. Here are the cybersecurity risks.

Phishing and ransomware are on the rise. Is your remote workforce prepared?

Before your company institutes work-from-home-forever plans, you need to ensure that your workforce is prepared to face the cybersecurity implications of long-term remote work.

Photo: Stefan Wermuth/Bloomberg via Getty Images

The delta variant continues to dash or delay return-to-work plans, but before your company institutes work-from-home-forever plans, you need to ensure that your workforce is prepared to face the cybersecurity implications of long-term remote work.

So far in 2021, CrowdStrike has already observed over 1,400 "big game hunting" ransomware incidents and $180 million in ransom demands averaging over $5 million each. That's due in part to the "expanded attack surface that work-from-home creates," according to CTO Michael Sentonas.

Michelle Ma
Michelle Ma (@himichellema) is a reporter at Protocol, where she writes about management, leadership and workplace issues in tech. Previously, she was a news editor of live journalism and special coverage for The Wall Street Journal. Prior to that, she worked as a staff writer at Wirecutter. She can be reached at mma@protocol.com.
Protocol | Fintech

When COVID rocked the insurance market, this startup saw opportunity

Ethos has outraised and outmarketed the competition in selling life insurance directly online — but there's still an $887 billion industry to transform.

Life insurance has been slow to change.

Image: courtneyk/Getty Images

Peter Colis cited a striking statistic that he said led him to launch a life insurance startup: One in twenty children will lose a parent before they turn 15.

"No one ever thinks that will happen to them, but that's the statistics," the co-CEO and co-founder of Ethos told Protocol. "If it's a breadwinning parent, the majority of those families will go bankrupt immediately, within three months. Life insurance elegantly solves this problem."

Benjamin Pimentel

Benjamin Pimentel (@benpimentel) covers fintech from San Francisco. He has reported on many of the biggest tech stories over the past 20 years for the San Francisco Chronicle, Dow Jones MarketWatch and Business Insider, from the dot-com crash, the rise of cloud computing, social networking and AI to the impact of the Great Recession and the COVID crisis on Silicon Valley and beyond. He can be reached at bpimentel@protocol.com or via Signal at (510) 731-8429.
