Facebook's 16-year history is riddled with privacy blunders. There was Mark Zuckerberg's original sin of scraping students' photos to build a Hot or Not copycat at Harvard. There was the launch of News Feed, when Facebook began broadcasting every action users took on the platform to all of their friends. And, of course, there was the 2018 Cambridge Analytica scandal that exposed, though not for the first time, just how much data the company was willing to give away to third parties in the name of growth.
Now, the social networking giant has a modest proposal for lawmakers drafting privacy rules around the world: Let us help you write them.
In a new white paper published Wednesday, Facebook pushes for a light-touch approach to privacy regulation that involves maximum input from and flexibility for businesses. These, of course, are already the sorts of policies most tech giants are lobbying for behind closed doors. But the paper pushes for this collaboration to happen out in the open.
It argues, for instance, that the best way to design privacy regulations is through "policy co-creation," in which governments and companies work together to prototype policies and test their viability before they're implemented. It makes a case for regulations that "avoid or remove strict, one-size-fits-all design requirements," opting instead for laws that "regulate the process for making privacy design decisions, not the outcome of those processes."
In Singapore, Facebook has already tested these concepts through an organization it launched called Trust, Transparency and Control Labs. Together with the Singapore government, TTC Labs created what the paper calls a "regulatory sandbox," where startups could design new types of privacy notices and consent features and get feedback from regulators.
Of course, the United States is not Singapore, and Congress has hardly met Facebook with open arms recently. Protocol spoke with Facebook's deputy chief privacy officer, Rob Sherman, about what the company is proposing, who it's trying to convince, and why anyone should trust Facebook now.
This interview has been edited and condensed for clarity.
Who is this for? Who is the intended audience?
I think there are a number of intended audiences. One of the things we've realized in thinking about these problems within Facebook is that governments are thinking about the right ways to regulate, experts are thinking about what the right practices are, and companies are thinking about how to build for privacy and for the communities they're serving. But they're not necessarily talking to each other.
Part of what we're trying to do is create a common conversation that brings those sets of stakeholders together. It's something we've started to do through our Trust, Transparency and Control Labs initiative, which holds a series of design jam workshops with experts, governments and companies to develop design solutions to some of these problems and put them out in openly accessible formats, so people have examples of what it looks like to improve their practices.
A lot of the points in the paper struck me as Facebook saying lawmakers need to work with industry to collaborate on these regulations. That's something I imagine a lot of the industry would agree with, but regulators and privacy advocates would be pretty hesitant about. All I see them wanting to do lately is punch Facebook in the nose. What's giving you the sense this collaborative spirit exists in Congress?
In some of our efforts outside the U.S., we've already found a fair amount of interest on the part of other companies and governments in having some of these conversations, because it helps us all get to a shared, better place.
One example is the regulatory sandbox we built with the Singapore government. This involves 14 companies working in a startup accelerator. They have resources, including privacy and nonprivacy expertise from Facebook, but also the ability to work with the government on best practices. That helps the government learn what works and what doesn't for smaller startups. And it helps the companies, and us for that matter, learn how to do these things at scale in practice.
That's Singapore. Right now, in the U.S. there's a lot of point-scoring trying to beat up on Big Tech. Why is this the moment to step in and say: The real solution to the privacy debate is to let us help you write these rules?
Getting this right is really critical. For people to be comfortable using Facebook, they need to trust we are both handling their data appropriately and communicating with them straightforwardly about that. The best way to do that is by talking to them, but also talking to other stakeholders in the ecosystem.
I also think when you look at areas outside of privacy — the financial sector's a good example — there are examples of co-created policies where industry gets together with experts and government to figure out what the right path forward is.
Have you broached this possibility with anyone in Congress, and if so, who? And how are those conversations going?
We view this as the beginning of the conversation, rather than the end. There aren't specific efforts with members to announce.
Communicating your privacy policy to the user comes last. First you need to have the policy in place that protects people's privacy. Where does Facebook stand in terms of privacy legislation that has been proposed in Congress? Is there anything you're supportive of?
We've been participating pretty actively in a number of different discussions around what privacy regulation might look like at the federal level and the state level. I think a lot of the discussions are going to align with the framework of giving people increased, clear rights over their data, putting specific obligations on companies to handle data responsibly, and identifying a regulator that's empowered to enforce those obligations. Getting to a place where there are consistent federal standards around how we approach privacy is important so we can have a specific standard to build to.
Are any of the bills in Congress bills you support?
I don't think we've expressed views on specific bills. The goal really at this point is to have conversations with a number of different stakeholders and try to get to the best place regardless of what bill is getting traction.
What about the California Privacy Rights Act? It looks like it has a good shot in November, and it would rewrite the California Consumer Privacy Act, which was a big deal when it passed. Would this make things harder for you, or do you support it?
It's something we've spent a lot of time thinking about. If it becomes law, it's something we will aim to comply with. It moves closer to something like [Europe's General Data Protection Regulation] when it comes to broadening the topics the legislation covers and giving people more rights over their data. I know there's a lot of debate on the ballot measure.
So, you aren't backing it or fighting it?
We haven't taken a position either for or against it.
Given you've been working on privacy at Facebook since 2012, how do you think you missed the possibility that giving app developers access to data on people's friend networks could be a privacy risk? If you're asking to be at the table with regulators to write the rules around privacy, they're going to point to the fact that you didn't get it right last time and ask why they should trust you now. So, explain how you missed that risk. Or is it possible it wasn't missed, and the business incentives of growing the platform simply outweighed the potential privacy risks?
When you look at the way we've approached communicating with people about their data in the context of the Facebook platform, that's something that's seen a pretty significant evolution over the years. The app permissions screen used to contain a lot of information, because that was the best practice at the time: the app developer's privacy policy, the information they'd be getting, and all of this detail. Over time, we've shifted toward simpler consent screens that are very clear about what developers want and that ask people to make a yes or no choice. That shift was based on research and our understanding of how people interacted with those screens.
A lot of the investment today is also around third-party oversight: making sure we have robust systems in place to enforce our policies, and that developers who get access to data through Facebook's systems, even with people's consent, are adhering to the standards they've agreed to.
Obviously it was a problem that people didn't know what they were agreeing to on the permissions screen, but the communication part came after Facebook created the policy allowing app developers to access people's friends' data in the first place. How did you miss that this was a privacy flaw?
It was something we considered and improved over time as part of the way we approached Platform. You saw changes in 2014. You saw changes in 2018. In parallel, you saw changes in the way we communicated.
It's clear there's a lot we could have done differently back then to avoid some of the challenges we're facing now, but I think we've tried to invest really aggressively in addressing those and getting to a better place. The hope is that the lessons we've learned through making mistakes and trying to improve our approach will help other companies and the broader policy discussion get to a more nuanced place.