Politics

Facebook's plan for privacy laws? 'Co-creating' them with Congress

In a newly published white paper, Facebook makes a case for a light-touch approach to privacy regulation that involves maximum flexibility for businesses.

Facebook argues the best way to write privacy policy is to do it with industry input.

Image: Rafael Henrique/SOPA Images/LightRocket via Getty Images

Facebook's 16-year history is riddled with privacy blunders. There was Mark Zuckerberg's original sin of scraping students' photos to build a Hot or Not copycat at Harvard. There was the launch of News Feed, when Facebook began broadcasting every action users took on the platform to all of their friends. And, of course, there was the 2018 Cambridge Analytica scandal that exposed, though not for the first time, just how much data the company was willing to give away to third parties in the name of growth.

Now, the social networking giant has a modest proposal for lawmakers drafting privacy rules around the world: Let us help you write them.

In a new white paper published Wednesday, Facebook pushes for a light-touch approach to privacy regulation that involves maximum input from and flexibility for businesses. These, of course, are already the sorts of policies most tech giants are lobbying for behind closed doors. But the paper pushes for this collaboration to happen out in the open.

It argues, for instance, that the best way to design privacy regulations is through "policy co-creation," in which governments and companies work together to prototype policies and test their viability before they're implemented. It makes a case for regulations that "avoid or remove strict, one-size-fits-all design requirements," opting instead for laws that "regulate the process for making privacy design decisions, not the outcome of those processes."

In Singapore, Facebook has already tested these concepts through an organization it launched called Trust, Transparency and Control Labs. Together with the Singapore government, TTC Labs created what the paper calls a "regulatory sandbox," where startups could design new types of privacy notices and consent features and get feedback from regulators.

Of course, the United States is not Singapore, and Congress has hardly met Facebook with open arms recently. Protocol spoke with Facebook's deputy chief privacy officer, Rob Sherman, about what the company is proposing, who it's trying to convince, and why anyone should trust Facebook now.

This interview has been edited and condensed for clarity.

Who is this for? Who is the intended audience?

I think there are a number of intended audiences. One of the things we've realized in thinking about these problems within Facebook is that governments are thinking about the right ways to regulate, experts are thinking about what the right practices are, and companies are thinking about how to build for privacy and build for the communities they're serving. But they're not necessarily talking to each other.

Part of what we're trying to do is create a conversation that brings together those sets of stakeholders into a common conversation. It's something we've started to do through our Trust, Transparency and Control Labs initiative, which we founded. It holds a series of design jam workshops with experts, governments and companies to try to develop design solutions to some of these problems and put them out in openly accessible formats, so people can have examples of what it looks like to improve their practices.

A lot of the points in the paper struck me as Facebook saying lawmakers need to work with industry to collaborate on these regulations. That's something I imagine a lot of the industry would agree with, but regulators and privacy advocates would be pretty hesitant about. All I see them wanting to do lately is punch Facebook in the nose. What's giving you the sense this collaborative spirit exists in Congress?

In some of our efforts outside the U.S., we've already found a fair amount of interest on the part of other companies and governments in having some of these conversations, because it all helps us get to a shared, better place.

One example is the regulatory sandbox we built with the Singapore government. This involves 14 companies working in a startup accelerator. They have resources, including privacy and nonprivacy expertise from Facebook, but also the ability to work with the government on best practices. That helps the government learn what works and what doesn't for smaller startups. And it helps the companies, and us for that matter, learn how to do these things at scale in practice.

That's Singapore. Right now, in the U.S. there's a lot of point-scoring trying to beat up on Big Tech. Why is this the moment to step in and say: The real solution to the privacy debate is to let us help you write these rules?

Getting this right is really critical. For people to be comfortable using Facebook, they need to trust we are both handling their data appropriately and communicating with them straightforwardly about that. The best way to do that is by talking to them, but also talking to other stakeholders in the ecosystem.

I also think when you look at areas outside of privacy — the financial sector's a good example — there are examples of co-created policies where industry gets together with experts and government to figure out what the right path forward is.

Have you broached this possibility with anyone in Congress, and if so, who? And how are those conversations going?

We view this as the beginning of the conversation, rather than the end. There aren't specific efforts with members to announce.

Communicating your privacy policy to the user comes last. First you need to have the policy in place that protects people's privacy. Where does Facebook stand in terms of privacy legislation that has been proposed in Congress? Is there anything you're supportive of?

We've been participating pretty actively in a number of different discussions around what privacy regulation might look like at the federal level and the state level. I think a lot of the discussions are going to align with the framework of giving people increased, clear rights over their data, the idea of putting specific obligations on companies to handle data responsibly, and identifying a regulator that's empowered to do that. Getting to a place where there are consistent federal standards around how we approach privacy is important, so we can have a specific standard we can build to.

Are any of the bills in Congress bills you support?

I don't think we've expressed views on specific bills. The goal really at this point is to have conversations with a number of different stakeholders and try to get to the best place regardless of what bill is getting traction.

What about the California Privacy Rights Act? It looks like it has a good shot in November, and it would rewrite the California Consumer Privacy Act, which was a big deal when it was passed. Would this make things harder for you, or do you support it?

It's something we've spent a lot of time thinking about. If it becomes law, it's something we will aim to comply with. It moves closer to something like [Europe's General Data Protection Regulation], when it comes to broadening the topics the legislation covers and giving people more rights over their data. I know there's a lot of debate on the ballot measure.

So, you aren't backing it or fighting it?

We haven't taken a position either for or against it.

Given you've been working on privacy at Facebook since 2012, how do you think you missed the possibility that giving app developers access to data on people's friend networks could be a privacy risk? If you're asking to be at the table with regulators to write the rules around privacy, they're going to point to the fact that you didn't get it right last time and ask why they should trust you now. So, explain how you missed that risk. Or is it possible it wasn't missed, and that the business incentives of growing the platform simply outweighed the potential privacy risks?

When you look at the way we've approached communicating with people about their data in the context of the Facebook platform, that's something that's seen a pretty significant evolution over the years. It used to be that the app permissions screen had a lot of information, because that was the best practice at the time. It included the app developer's privacy policy, the information they'd be getting and all of this detail. Over time, we've shifted toward simpler consent screens that are very clear about what developers want and that ask people to make a yes or no choice. That was an effort based on research and our understanding of how people interacted with those things.

A lot of the investment today is also around third-party oversight: making sure we have robust systems in place to enforce our policies, and that developers who get access to data through Facebook's systems, even with people's consent, are adhering to the standards they've agreed to.

Obviously it was a problem that people didn't know what they were agreeing to in the permissions page, but the communication part came after the policy was created allowing app developers to access people's friends' data in the first place. How did you miss that that was a privacy flaw?

It was something we considered and that we improved over time as a part of the way that we approached Platform. You saw changes in 2014. You saw changes in 2018. In parallel to that you saw changes in the way we communicated.

It's clear there's a lot we could have done differently back then to avoid some of the challenges that we're facing now, but I think we've tried to invest really aggressively in addressing those and getting to a better place. The hope is that the learnings we've made through making mistakes and trying to improve our approach will help other companies and the broader policy discussion get to a more nuanced place.

Issie Lapowsky

Issie Lapowsky (@issielapowsky) is Protocol's chief correspondent, covering the intersection of technology, politics, and national affairs. She also oversees Protocol's fellowship program. Previously, she was a senior writer at Wired, where she covered the 2016 election and the Facebook beat in its aftermath. Prior to that, Issie worked as a staff writer for Inc. magazine, writing about small business and entrepreneurship. She has also worked as an on-air contributor for CBS News and taught a graduate-level course at New York University's Center for Publishing on how tech giants have affected publishing.
