Politics

Facebook's plan for privacy laws? 'Co-creating' them with Congress

In a newly published white paper, Facebook makes a case for a light-touch approach to privacy regulation that involves maximum flexibility for businesses.

Facebook argues the best way to write privacy policy is to do it with industry input.

Image: Rafael Henrique/SOPA Images/LightRocket via Getty Images

Facebook's 16-year history is riddled with privacy blunders. There was Mark Zuckerberg's original sin of scraping students' photos to build a Hot or Not copycat at Harvard. There was the launch of News Feed, when Facebook began broadcasting every action users took on the platform to all of their friends. And, of course, there was the 2018 Cambridge Analytica scandal that exposed, though not for the first time, just how much data the company was willing to give away to third parties in the name of growth.

Now, the social networking giant has a modest proposal for lawmakers drafting privacy rules around the world: Let us help you write them.

In a new white paper published Wednesday, Facebook pushes for a light-touch approach to privacy regulation that involves maximum input from and flexibility for businesses. These, of course, are already the sorts of policies most tech giants are lobbying for behind closed doors. But the paper pushes for this collaboration to happen out in the open.

It argues, for instance, that the best way to design privacy regulations is through "policy co-creation," in which governments and companies work together to prototype policies and test their viability before they're implemented. It makes a case for regulations that "avoid or remove strict, one-size-fits-all design requirements," opting instead for laws that "regulate the process for making privacy design decisions, not the outcome of those processes."

In Singapore, Facebook has already tested these concepts through an organization it launched called Trust, Transparency and Control Labs. Together with the Singapore government, TTC Labs created what the paper calls a "regulatory sandbox," where startups could design new types of privacy notices and consent features and get feedback from regulators.

Of course, the United States is not Singapore, and Congress has hardly met Facebook with open arms recently. Protocol spoke with Facebook's deputy chief privacy officer, Rob Sherman, about what the company is proposing, who it's trying to convince, and why anyone should trust Facebook now.

This interview has been edited and condensed for clarity.

Who is this for? Who is the intended audience?

I think there are a number of intended audiences. One of the things we've realized in thinking about these problems within Facebook is that governments are thinking about the right ways to regulate, experts are thinking about what the right practices are, and companies are thinking about how to build for privacy and build for the communities they're serving. But they're not necessarily talking to each other.

Part of what we're trying to do is bring those sets of stakeholders together into a common conversation. It's something we've started to do through our Trust, Transparency and Control Labs initiative, which holds a series of design jam workshops with experts, governments and companies to develop design solutions to some of these problems and put them out in openly accessible formats, so people can have examples of what it looks like to improve their practices.

A lot of the points in the paper struck me as Facebook saying lawmakers need to work with industry to collaborate on these regulations. That's something I imagine a lot of the industry would agree with, but regulators and privacy advocates would be pretty hesitant about. All I see them wanting to do lately is punch Facebook in the nose. What's giving you the sense this collaborative spirit exists in Congress?

In some of our efforts outside the U.S., we've already found a fair amount of interest on the part of other companies and governments in having some of these conversations, because it all helps us get to a shared, better place.

One example is the regulatory sandbox we built with the Singapore government. This involves 14 companies working in a startup accelerator. They have resources, including privacy and nonprivacy expertise from Facebook, but also the ability to work with the government on best practices. That helps the government learn what works and what doesn't for smaller startups. And it helps the companies, and us for that matter, learn how to do these things at scale in practice.

That's Singapore. Right now, in the U.S. there's a lot of point-scoring trying to beat up on Big Tech. Why is this the moment to step in and say: The real solution to the privacy debate is to let us help you write these rules?

Getting this right is really critical. For people to be comfortable using Facebook, they need to trust we are both handling their data appropriately and communicating with them straightforwardly about that. The best way to do that is by talking to them, but also talking to other stakeholders in the ecosystem.

I also think when you look at areas outside of privacy — the financial sector's a good example — there are examples of co-created policies where industry gets together with experts and government to figure out what the right path forward is.

Have you broached this possibility with anyone in Congress, and if so, who? And how are those conversations going?

We view this as the beginning of the conversation, rather than the end. There aren't specific efforts with members to announce.

Communicating your privacy policy to the user comes last. First you need to have the policy in place that protects people's privacy. Where does Facebook stand in terms of privacy legislation that has been proposed in Congress? Is there anything you're supportive of?

We've been participating pretty actively in a number of different discussions around what privacy regulation might look like at the federal level and the state level. I think a lot of the discussions are going to align with the framework of giving people increased, clear rights over their data, the idea of putting specific obligations on companies to handle data responsibly, and identifying a regulator that's empowered to do that. Getting to a place where there are consistent federal standards around how we approach privacy is important so we can have a specific standard we can build to.

Are any of the bills in Congress bills you support?

I don't think we've expressed views on specific bills. The goal really at this point is to have conversations with a number of different stakeholders and try to get to the best place regardless of what bill is getting traction.

What about the California Privacy Rights Act? It looks like it has a good shot in November and would rewrite the California Consumer Privacy Act, which was a big deal when it passed. Would this make things harder for you, or do you support it?

It's something we've spent a lot of time thinking about. If it becomes law, it's something we will aim to comply with. It moves closer to something like [Europe's General Data Protection Regulation], when it comes to broadening the topics the legislation covers and giving people more rights over their data. I know there's a lot of debate on the ballot measure.

So, you aren't backing it or fighting it?

We haven't taken a position either for it or against.

Given you've been working on privacy at Facebook since 2012, how do you think you missed the possibility that giving app developers access to data on people's friend networks could be a privacy risk? If you're asking to be at the table with regulators to write the rules around privacy, they're going to point to the fact that you didn't get it right last time and ask why they should trust you now. So, explain how you missed that risk. Or is it possible it wasn't missed, and the business incentives of growing the platform simply outweighed the potential privacy risks?

When you look at the way we've approached communicating with people about their data in the context of the Facebook platform, that's something that's seen a pretty significant evolution over the years. The app permissions screen used to contain a lot of information, because that was the best practice at the time: the app developer's privacy policy, the information the developer would be getting, all of this detail. Over time, we've shifted toward simpler consent screens that are very clear about what developers want and that ask people to make a yes or no choice. That was an effort based on research and our understanding of how people interacted with those screens.

A lot of the investment today is also around third-party oversight: making sure we have robust systems in place to enforce our policies and to ensure that developers who get access to data through Facebook's systems, even with people's consent, are adhering to the standards they've agreed to.

Obviously it was a problem that people didn't know what they were agreeing to on the permissions page, but the communication part came after the creation of the policy that allowed app developers to access people's friends' data in the first place. How did you miss that that was a privacy flaw?

It was something we considered and that we improved over time as a part of the way that we approached Platform. You saw changes in 2014. You saw changes in 2018. In parallel to that you saw changes in the way we communicated.

It's clear there's a lot we could have done differently back then to avoid some of the challenges that we're facing now, but I think we've tried to invest really aggressively in addressing those and getting to a better place. The hope is that the learnings we've made through making mistakes and trying to improve our approach will help other companies and the broader policy discussion get to a more nuanced place.
