Andrea Peterson
Politics

Facebook said what about content regulations?

A new white paper proposes "a way forward" on content rules.

Vera Jourova and Mark Zuckerberg

Facebook released a 22-page white paper in tandem with Mark Zuckerberg's meeting with EU regulators.

Photo: AFP via Getty Images

Facebook knows more rules are coming on harmful content, so it's doing what it can to shape them.

In a 22-page white paper released Monday in tandem with Mark Zuckerberg's meeting with EU regulators, Facebook's VP of content policy Monika Bickert wrestles with big questions about speech online and lays out a policy roadmap for regulating content — a roadmap that borrows from the company's existing procedures, including transparency reports and channels for users to report content. Except now, Facebook is encouraging governments to adopt similar policies as regulations, along with a new liability model, as it pushes for global standards for managing harmful content. It's a shift that would make it easier for the company to navigate a world of fractured national requirements.


It's "more of a summation of the thinking and conversations about this topic that have been happening for several years than anything else," said Renée DiResta, research manager at the Stanford Internet Observatory.

"I don't see much that's new," she added.

As suggested by the paper's timing, it's targeted at the very regulators Zuckerberg is attempting to charm right now.

"It's important to understand that this white paper is talking about regulation outside the U.S.," said St. John's Law professor Kate Klonick.

Several countries, including Germany and Australia, have enacted laws with stiff penalties for companies that fail to remove certain content within certain timeframes, drawing complaints from companies including Facebook.

Here are four key highlights from the white paper — and what they mean:

'Preserving free expression'

"In the United States, for example, the First Amendment protects a citizen's ability to engage in dialogue online without government interference except in the narrowest of circumstances. Citizens in other countries often have different expectations about freedom of expression, and governments have different expectations about platform accountability. Unfortunately, some of the laws passed so far do not always strike the appropriate balance between speech and harm, unintentionally pushing platforms to err too much on the side of removing content." (Page 4)

The paper opens with an extended meditation on free expression and acknowledges that private platforms like Facebook are "increasingly" the ones making determinations about what speech is allowable online. But the section above flags that Facebook is most concerned about regulation outside the U.S., while throwing shade at some established laws.

Without explicitly addressing them, the paper is "clearly a response" to two laws, Klonick said: the German Network Enforcement Act (or NetzDG) and the United Kingdom's currently in-process Online Harms regulations.

NetzDG is aimed at limiting the spread of hate speech, which is illegal in Germany. It was passed by Germany's Bundestag in 2017 — and Facebook has already faced fines for allegedly violating it. The law includes a 24-hour removal requirement for material that breaks German hate speech law and penalties of up to 50 million euros for violations.

However, human rights activists also criticized that law for being over-broad and potentially infringing on freedom of expression rights. Civil liberties advocates are also worried about censorship when it comes to proposed U.K. regulation.

'Systems and procedures' vs. 'performance targets'

"By requiring systems such as user-friendly channels for reporting content or external oversight of policies or enforcement decisions, and by requiring procedures such as periodic public reporting of enforcement data, regulation could provide governments and individuals the information they need to accurately judge social media companies' efforts." (Page 9)

When it comes to regulatory models, Facebook argues in favor of an approach that requires various "systems and procedures." If that sounds familiar, it's because they are already built into Facebook's current operations.

The company also makes a case against establishing "performance targets" for companies, such as fines if companies don't keep harmful content below a certain threshold. That approach could create "perverse incentives," the company argues, and lead to procedures that do more to juice the required metrics than actually stop the spread of harmful content.

For example, the company argues that a 24-hour removal requirement, like the one in the German law, could discourage companies from proactively searching for and removing offending content that falls outside that window, since doing so would make their reported compliance numbers look worse.

'Harmful content'

"Among types of content most likely to be outlawed, such as content supporting terrorism and content related to the sexual exploitation of children, there is significant variance in laws and enforcement. In some jurisdictions, for instance, images of child sexual exploitation are lawful if they are computer generated. In others, praise of terror groups is permitted, and lists of proscribed terror groups vary widely by country and region. Laws vary even more in areas such as hate speech, harassment, or misinformation. Many governments are grappling with how to approach the spread of misinformation, but few have outlawed it." (Page 17)

One big issue Facebook has come up against is that laws already on the books aren't very specific about what makes content harmful or what kind of remedy companies should deliver. For example, Australia passed a law in the aftermath of the Christchurch shooting that called on social media companies to remove "abhorrent violent material," defined as videos of rapes, murders, and terrorist attacks — but the timeline for that removal, "expeditiously," is open to a lot of interpretation.

Ultimately, rather than talk about specific content that should be regulated, Facebook argues that governments should consider a few issues before deciding how to approach such content. Among them, that their rules could "be enforced practically, at scale, with limited context about the speaker and content" and take into account the nature of the content (private vs. public, permanent vs. ephemeral) while providing "flexibility so that platforms can adapt policies to emerging language trends."

What's next?

"Any national regulatory approach to addressing harmful content should respect the global scale of the internet and the value of cross-border communications. It should aim to increase interoperability among regulators and regulations. However, governments should not impose their standards onto other countries' citizens through the courts or any other means." (Page 19)

The paper builds on a point CEO Mark Zuckerberg made in a Washington Post op-ed last year, when he argued "we need a more standardized approach" to combating harmful content like hate and terrorist speech online. And while the policy content may not be a revelation, its release makes clear the direction Facebook hopes larger global conversations about content moderation will go: toward an integrated system, a conversation that global platforms will inevitably help drive.

Some in the tech industry seem on board. For example, Twitter Director of Public Policy Strategy Nick Pickles thanked Facebook for the report — naturally, via tweet — calling it "an important contribution to the debate on tech regulation."

However, the European lawmakers Facebook was likely trying to woo seem less enthused.

"It's not for us to adapt to those companies, but for them to adapt to us," said Europe's commissioner for internal markets Thierry Breton, according to a POLITICO report.

Microsoft wants to replace artists with AI

Better Zoom calls, simpler email attachments, smart iPhone cases and other patents from Big Tech.

Turning your stories into images.

Image: USPTO/Microsoft

Hello and welcome to 2021! The Big Tech patent roundup is back, after a short vacation and … all the things … that happened between the start of the year and now. It seems the tradition of tech companies filing weird and wonderful patents has carried into the new year; there are some real gems from the last few weeks. Microsoft is trying to outsource all creative endeavors to AI; Apple wants to make seat belts less annoying; and Amazon wants to cut down on some of the recyclable waste that its own success has inevitably created.

And remember: The big tech companies file all kinds of crazy patents for things, and though most never amount to anything, some end up defining the future.

Mike Murphy

Mike Murphy ( @mcwm) is the director of special projects at Protocol, focusing on the industries being rapidly upended by technology and the companies disrupting incumbents. Previously, Mike was the technology editor at Quartz, where he frequently wrote on robotics, artificial intelligence, and consumer electronics.

Politics

Facebook’s Oversight Board won’t save it from the Trump ban backlash

The Board's decision on whether to reinstate Trump could set a new precedent for Facebook. But does the average user care what the Board has to say?

A person holds a sign during a Free Speech Rally against tech companies, on Jan. 20 in California.

Photo: Valerie Macon/Getty Images

Two weeks after Facebook suspended former President Donald Trump's account indefinitely, Facebook answered a chorus of calls and referred the case to its newly created Oversight Board for review. Now, the board has 90 days to make a call as to whether Trump stays or goes permanently. The board's decision — and more specifically, how and why it arrives at that decision — could have consequences not only for other global leaders on Facebook, but for the future of the Board itself.

Facebook created its Oversight Board for such a time as this — a time when it would face a controversial content moderation decision and might need a gut check. Or a fall guy. There could be no decision more controversial than the one Facebook made on Jan. 7, when it decided to muzzle one of the most powerful people in the world with weeks remaining in his presidency. It stands to reason, then, that Facebook would tap in its newly anointed refs on the Oversight Board both to earnestly review the call and to put a little distance between Facebook and the decision.

Issie Lapowsky
Issie Lapowsky (@issielapowsky) is a senior reporter at Protocol, covering the intersection of technology, politics, and national affairs. Previously, she was a senior writer at Wired, where she covered the 2016 election and the Facebook beat in its aftermath. Prior to that, Issie worked as a staff writer for Inc. magazine, writing about small business and entrepreneurship. She has also worked as an on-air contributor for CBS News and taught a graduate-level course at New York University’s Center for Publishing on how tech giants have affected publishing. Email Issie.
Politics

This is the future of the FTC

President Joe Biden has named Becca Slaughter acting chair of the FTC. In conversation with Protocol, she laid out her priorities for the next four years.

FTC commissioner Becca Slaughter may be President Biden's pick for FTC chair.

Photo: David Becker/Getty Images

Becca Slaughter made a name for herself last year when, as a commissioner for the Federal Trade Commission, she breastfed her newborn baby during video testimony before the Senate, raising awareness about the plight of working parents during the pandemic.

But on Thursday, Slaughter's name began circulating for other reasons: She was just named as President Joe Biden's pick for acting chair of the FTC, an appointment that puts Slaughter at the head of antitrust investigations into tech giants, including Facebook.

Issie Lapowsky
Politics

The other reason Facebook silenced Trump? Republicans lost power.

Yes, the president's acts were unprecedented. But Facebook is also preparing for a new Washington, controlled by Democrats.

Mark Zuckerberg and Facebook's head of public policy Joel Kaplan have spent four years bending to conservatives' demands. Now, Facebook is bending in a new direction.

Photo: Samuel Corum/Getty Images

In his post announcing that President Trump would be blocked from posting on Facebook until at least Inauguration Day, Mark Zuckerberg wrote that the president's incitement of the violent mob that stormed the U.S. Capitol building Wednesday was "fundamentally different" than any of the offenses he's committed on Facebook before. "The risks of allowing the President to continue to use our service during this period are simply too great," he wrote on Thursday.

That may be true. But there's another reason why — after four years spent insisting that a tech company has no business shutting up the president of the United States, no matter how much he threatens to shoot protesters or engages in voter suppression — Zuckerberg finally had a change of heart: Republicans just lost power.

Issie Lapowsky
Power

Pressure mounts on tech giants to ban Trump, as rioters storm Capitol

Facebook, Twitter and YouTube removed a video in which Trump expressed love for the rioters, but none of the companies have banned him outright — yet.

Twitter locked President Trump's account.

Image: Twitter

Twitter, Facebook and YouTube took action against several of President Trump's posts Wednesday, labeling the posts, limiting reshares and removing a video in which President Trump expressed his love for rioters who stormed the U.S. Capitol building, leading to the evacuation of the Senate, the deployment of the National Guard and to one person being shot and killed. Twitter locked President Trump's account, requiring him to remove three tweets and saying that his account would remain locked for 12 hours after those tweets were removed. Twitter also warned that any future violations would get him banned. Facebook also locked his account for 24 hours, citing "two policy violations." These actions followed a day of calls from tech investors, academics and others to kick Trump off of their platforms once and for all.

In an early tweet, University of Virginia law professor Danielle Citron implored Twitter CEO Jack Dorsey to take action. "As someone who has served on your Trust and Safety Board since its inception and counseled you since 2009, time is now to suspend President Trump's account," Citron wrote. "He has deliberately incited violence, causing mayhem with his lies and threats."

Issie Lapowsky