
Facebook’s new content moderation report only proves the case of its moderators

The company's ability to spot violating posts is back to pre-pandemic levels, just as its moderators start heading back to the office.

Facebook's office in Menlo Park

Dozens of Facebook moderators signed an open letter this week to Mark Zuckerberg and others that criticizes recent orders that they return to the office despite a surge in COVID-19 cases and demands they be made full-time employees.

Image: AFP/Getty Images

Facebook's decision to send content moderators home in March had devastating consequences for the company's ability to catch and remove posts containing the most harmful content. Now, as some moderators have returned to the office in recent months, things are getting back to normal — underlining the importance of having humans in the loop.

That's according to Facebook's third-quarter transparency report, published Thursday. It shows, for instance, that in the third quarter of this year, Instagram removed nearly twice as much child sexual abuse material and nearly five times as much suicide-related content as it did in the second quarter.

This dramatic shift underscores just how crucial this global army of moderators is to the way the world's largest social media platform operates.

"People are an important part of the equation for content enforcement," Guy Rosen, Facebook's vice president of integrity, said on a call with reporters Thursday. "These are incredibly important workers who do an incredibly important part of this job ... The reason we're bringing some workers back into offices is exactly to ensure that we can have that balance of both people and AI working on these areas."

Facebook's report comes just one day after dozens of moderators signed an open letter to Facebook CEO Mark Zuckerberg and others that criticizes recent orders that they return to the office despite a surge in COVID-19 cases and demands that they be made full-time employees. "By outsourcing our jobs, Facebook implies that the 35,000 of us who work in moderation are somehow peripheral to social media," the letter read. "Yet we are so integral to Facebook's viability that we must risk our lives to come into work."

Rosen stressed that the majority of content moderators are still working from home, but said that those who have gone back to the office are doing so in spaces with reduced capacity, physical distancing, mandatory temperature checks and other safety precautions "to ensure that we're providing a safe workspace for them to do this incredibly important work to keep our community safe as well."

The moderators argued that's not enough, and are pushing Facebook to guarantee them things like "real healthcare," hazard pay and the ability to continue working from home if they live with at-risk individuals.

Facebook's executives credited many of this quarter's gains to the company's investment in automated systems, including its ability to proactively detect 95% of hate speech on the platform. When Facebook first began reporting this statistic in 2017, just 23.6% of hate speech was detected proactively, before users reported it. For the first time, Facebook also reported the prevalence of hate speech on the platform — that is, the percentage of content views in which people actually see hate speech. It found a prevalence of 0.10% to 0.11%, meaning that for every 10,000 views on Facebook, about 10 or 11 contained hate speech.

Despite these advances, Facebook's chief technology officer, Mike Schroepfer, acknowledged that automated filters will never replace the work of human moderators. "I don't see any short-term reduction or long-term reduction in the human involvement in this," he said on the call. "We get faster, more accurate, more powerful and then we can use our amazing staff we have to work on the more nuanced problems we have that really require human review."

People

Beeper built the universal messaging app the world needed

It's an app for all your social apps. And part of an entirely new way to think about chat.

Beeper is an app for all your messaging apps, including the hard-to-access ones.

Image: Beeper

Eric Migicovsky likes to tinker. And the former CEO of Pebble — he's now a partner at Y Combinator — knows a thing or two about messaging. "You remember on the Pebble," he asked me, "how we had this microphone, and on Android you could reply to all kinds of messages?" Migicovsky liked that feature, and he especially liked that it didn't care which app you used. Android-using Pebble wearers could speak their replies to texts, Messenger chats, almost any notification that popped up.

That kind of universal, non-siloed approach to messaging appealed to Migicovsky, and it didn't really exist anywhere else. "Remember Trillian from back in the day?" he asked, somewhat wistfully. "Or Adium?" They were the gold standard of universal messaging apps: users could log in to their AIM, MSN, GChat and Yahoo accounts, and chat with everyone in one place.

David Pierce
David Pierce (@pierce) is Protocol's editor at large. Prior to joining Protocol, he was a columnist at The Wall Street Journal, a senior writer with Wired, and deputy editor at The Verge. He owns all the phones.

Politics

Facebook’s Oversight Board won’t save it from the Trump ban backlash

The Board's decision on whether to reinstate Trump could set a new precedent for Facebook. But does the average user care what the Board has to say?

A person holds a sign during a Free Speech Rally against tech companies, on Jan. 20 in California.

Photo: Valerie Macon/Getty Images

Two weeks after Facebook suspended former President Donald Trump's account indefinitely, Facebook answered a chorus of calls and referred the case to its newly created Oversight Board for review. Now, the board has 90 days to make a call as to whether Trump stays or goes permanently. The board's decision — and more specifically, how and why it arrives at that decision — could have consequences not only for other global leaders on Facebook, but for the future of the Board itself.

Facebook created its Oversight Board for such a time as this — a time when it would face a controversial content moderation decision and might need a gut check. Or a fall guy. There could be no decision more controversial than the one Facebook made on Jan. 7, when it decided to muzzle one of the most powerful people in the world with weeks remaining in his presidency. It stands to reason, then, that Facebook would tap in its newly anointed refs on the Oversight Board both to earnestly review the call and to put a little distance between Facebook and the decision.

Issie Lapowsky
Issie Lapowsky (@issielapowsky) is a senior reporter at Protocol, covering the intersection of technology, politics, and national affairs. Previously, she was a senior writer at Wired, where she covered the 2016 election and the Facebook beat in its aftermath. Prior to that, Issie worked as a staff writer for Inc. magazine, writing about small business and entrepreneurship. She has also worked as an on-air contributor for CBS News and taught a graduate-level course at New York University’s Center for Publishing on how tech giants have affected publishing. Email Issie.
Politics

This is the future of the FTC

President Joe Biden has named Becca Slaughter acting chair of the FTC. In conversation with Protocol, she laid out her priorities for the next four years.

FTC commissioner Becca Slaughter, President Biden's pick for acting chair of the agency.

Photo: David Becker/Getty Images

Becca Slaughter made a name for herself last year when, as a commissioner for the Federal Trade Commission, she breastfed her newborn baby during video testimony before the Senate, raising awareness about the plight of working parents during the pandemic.

But on Thursday, Slaughter's name began circulating for other reasons: She was just named as President Joe Biden's pick for acting chair of the FTC, an appointment that puts Slaughter at the head of antitrust investigations into tech giants, including Facebook.

Issie Lapowsky
Politics

The other reason Facebook silenced Trump? Republicans lost power.

Yes, the president's acts were unprecedented. But Facebook is also preparing for a new Washington, controlled by Democrats.

Mark Zuckerberg and Facebook's head of public policy Joel Kaplan have spent four years bending to conservatives' demands. Now, Facebook is bending in a new direction.

Photo: Samuel Corum/Getty Images

In his post announcing that President Trump would be blocked from posting on Facebook until at least Inauguration Day, Mark Zuckerberg wrote that the president's incitement of the violent mob that stormed the U.S. Capitol building Wednesday was "fundamentally different" than any of the offenses he's committed on Facebook before. "The risks of allowing the President to continue to use our service during this period are simply too great," he wrote on Thursday.

That may be true. But there's another reason why — after four years spent insisting that a tech company has no business shutting up the president of the United States, no matter how much he threatens to shoot protesters or engages in voter suppression — Zuckerberg finally had a change of heart: Republicans just lost power.

Issie Lapowsky
Power

Pressure mounts on tech giants to ban Trump, as rioters storm Capitol

Facebook, Twitter and YouTube removed a video in which Trump expressed love for the rioters, but none of the companies have banned him outright — yet.

Twitter locked President Trump's account.

Image: Twitter

Twitter, Facebook and YouTube took action against several of President Trump's posts Wednesday, labeling the posts, limiting reshares and removing a video in which Trump expressed his love for the rioters who stormed the U.S. Capitol building. The riot led to the evacuation of the Senate, the deployment of the National Guard and one person being shot and killed. Twitter locked President Trump's account, requiring him to remove three tweets and saying that the account would remain locked for 12 hours after those tweets were removed; it also warned that any future violations would get him banned. Facebook locked his account for 24 hours, citing "two policy violations." These actions followed a day of calls from tech investors, academics and others to kick Trump off the platforms once and for all.

In an early tweet, University of Virginia law professor Danielle Citron implored Twitter CEO Jack Dorsey to take action. "As someone who has served on your Trust and Safety Board since its inception and counseled you since 2009, time is now to suspend President Trump's account," Citron wrote. "He has deliberately incited violence, causing mayhem with his lies and threats."

Issie Lapowsky