Politics

Facebook's plan for privacy laws? 'Co-creating' them with Congress

In a newly published white paper, Facebook makes a case for a light-touch approach to privacy regulation that involves maximum flexibility for businesses.


Facebook argues the best way to write privacy policy is to do it with industry input.

Image: Rafael Henrique/SOPA Images/LightRocket via Getty Images

Facebook's 16-year history is riddled with privacy blunders. There was Mark Zuckerberg's original sin of scraping students' photos to build a Hot or Not copycat at Harvard. There was the launch of News Feed, when Facebook began broadcasting every action users took on the platform to all of their friends. And, of course, there was the 2018 Cambridge Analytica scandal that exposed, though not for the first time, just how much data the company was willing to give away to third parties in the name of growth.

Now, the social networking giant has a modest proposal for lawmakers drafting privacy rules around the world: Let us help you write them.

In a new white paper published Wednesday, Facebook pushes for a light-touch approach to privacy regulation that involves maximum input from and flexibility for businesses. These, of course, are already the sorts of policies most tech giants are lobbying for behind closed doors. But the paper pushes for this collaboration to happen out in the open.

It argues, for instance, that the best way to design privacy regulations is through "policy co-creation," in which governments and companies work together to prototype policies and test their viability before they're implemented. It makes a case for regulations that "avoid or remove strict, one-size-fits-all design requirements," opting instead for laws that "regulate the process for making privacy design decisions, not the outcome of those processes."

In Singapore, Facebook has already tested these concepts through an organization it launched called Trust, Transparency and Control Labs. Together with the Singapore government, TTC Labs created what the paper calls a "regulatory sandbox," where startups could design new types of privacy notices and consent features and get feedback from regulators.

Of course, the United States is not Singapore, and Congress has hardly met Facebook with open arms recently. Protocol spoke with Facebook's deputy chief privacy officer, Rob Sherman, about what the company is proposing, who it's trying to convince, and why anyone should trust Facebook now.

This interview has been edited and condensed for clarity.

Who is this for? Who is the intended audience?

I think there are a number of intended audiences. One of the things we've realized in thinking about these problems within Facebook is governments are thinking about the right ways to regulate. Experts are thinking about what the right practices are, and companies are thinking about how to build for privacy and build for the communities they're serving. But they're not necessarily talking to each other.

Part of what we're trying to do is create a conversation that brings together those sets of stakeholders into a common conversation. It's something we've started to do through our Trust, Transparency and Control Labs initiative, which we founded. It holds a series of design jam workshops with experts, governments and companies to try to develop design solutions to some of these problems and put them out in openly accessible formats, so people can have examples of what it looks like to improve their practices.

A lot of the points in the paper struck me as Facebook saying lawmakers need to work with industry to collaborate on these regulations. That's something I imagine a lot of the industry would agree with, but regulators and privacy advocates would be pretty hesitant about. All I see them wanting to do lately is punch Facebook in the nose. What's giving you the sense this collaborative spirit exists in Congress?

In some of our efforts outside the U.S., we've already found a fair amount of interest from other companies and governments in having some of these conversations, because it all helps us get to a shared, better place.

One example is the regulatory sandbox we built with the Singapore government. This involves 14 companies working in a startup accelerator. They have resources, including privacy and nonprivacy expertise from Facebook, but also the ability to work with the government on best practices. That helps the government learn what works and what doesn't for smaller startups. And it helps the companies, and us for that matter, learn how to do these things at scale in practice.

That's Singapore. Right now, in the U.S. there's a lot of point-scoring trying to beat up on Big Tech. Why is this the moment to step in and say: The real solution to the privacy debate is to let us help you write these rules?

Getting this right is really critical. For people to be comfortable using Facebook, they need to trust we are both handling their data appropriately and communicating with them straightforwardly about that. The best way to do that is by talking to them, but also talking to other stakeholders in the ecosystem.

I also think when you look at areas outside of privacy — the financial sector's a good example — there are examples of co-created policies where industry gets together with experts and government to figure out what the right path forward is.

Have you broached this possibility with anyone in Congress, and if so, who? And how are those conversations going?

We view this as the beginning of the conversation, rather than the end. There aren't specific efforts with members to announce.

Communicating your privacy policy to the user comes last. First you need to have the policy in place that protects people's privacy. Where does Facebook stand in terms of privacy legislation that has been proposed in Congress? Is there anything you're supportive of?

We've been participating pretty actively in a number of different discussions around what privacy regulation might look like at the federal level and the state level. I think a lot of the discussions are going to align with the framework of giving people increased, clear rights over their data, the idea of putting specific obligations on companies to handle data responsibly, and identifying a regulator that's empowered to do that. Getting to a place where there are consistent federal standards around how we approach privacy is important so we can have a specific standard we can build to.

Are any of the bills in Congress bills you support?

I don't think we've expressed views on specific bills. The goal really at this point is to have conversations with a number of different stakeholders and try to get to the best place regardless of what bill is getting traction.

What about the California Privacy Rights Act, which looks like it has a good shot in November and would rewrite the California Consumer Privacy Act, which was a big deal when it was passed? Would this make things harder for you, or do you support it?

It's something we've spent a lot of time thinking about. If it becomes law, it's something we will aim to comply with. It moves closer to something like [Europe's General Data Protection Regulation], when it comes to broadening the topics the legislation covers and giving people more rights over their data. I know there's a lot of debate on the ballot measure.

So, you aren't backing it or fighting it?

We haven't taken a position either for it or against.

Given you've been working on privacy at Facebook since 2012, how do you think you missed the possibility that giving app developers access to data on people's friend networks could be a privacy risk? If you're asking to be at the table with regulators to write the rules around privacy, they're going to point to the fact that you didn't get it right last time and ask why they should trust you now. So, explain how you missed that risk. Or is it possible it wasn't missed, and that the business incentives of growing the platform simply outweighed the potential privacy risks?

When you look at the way we've approached communicating with people about their data in the context of the Facebook platform, that's something that's seen a pretty significant evolution over the years. It used to be that the app permissions screen had a lot of information, because that was the best practice at the time. It included the app developer's privacy policy, the information the app would be getting and all of this detail. Over time, we've shifted toward simpler consent screens that are very clear about what developers want and that ask people to make a yes or no choice. That was an effort based on research and our understanding of how people interacted with those things.

A lot of the investment today is also around third-party oversight and making sure we have robust systems in place to make sure we're enforcing our policies and making sure developers that get access to data through Facebook's systems, even with people's consent, are adhering to the standards they've agreed to.

Obviously it was a problem that people didn't know what they were agreeing to in the permissions page, but the communication part came after the policy was created allowing app developers to access people's friends' data in the first place. How did you miss that that was a privacy flaw?

It was something we considered and that we improved over time as a part of the way that we approached Platform. You saw changes in 2014. You saw changes in 2018. In parallel to that you saw changes in the way we communicated.

It's clear there's a lot we could have done differently back then to avoid some of the challenges that we're facing now, but I think we've tried to invest really aggressively in addressing those and getting to a better place. The hope is that the learnings we've made through making mistakes and trying to improve our approach will help other companies and the broader policy discussion get to a more nuanced place.
