Proton’s plan for a more private internet

Andy Yen joins the Source Code podcast to talk about building privacy-focused apps, the encryption backdoor debate and whether Apple is actually fighting for the user.

ProtonMail's encrypted email was just the beginning for Proton Technologies.

Photo: ProtonMail

The team behind ProtonMail didn't necessarily start out to build an email client. They started out with a big idea: that privacy online is worth preserving, and that it's possible to build great products that preserve that privacy. They started with email, CEO Andy Yen said, because in the 21st century "your email address is your online identity." Since then, the Proton team has built a VPN, a calendar, a file storage system, and has much more on the roadmap.

Yen joined the Source Code Podcast to talk about Proton's story, what it takes to build products with privacy as a first-class feature and why that's getting easier to do all the time. He also talked about the debate over encryption backdoors for law enforcement, whether Apple's push for privacy is really about doing right by users and the other privacy-focused products the internet needs most.

Subscribe to the show: Apple Podcasts | Spotify | Google Podcasts | RSS

The following excerpts from our interview have been edited for length and clarity.

In the early days of ProtonMail, how does thinking about privacy change the way that you start and build both the company and the product? I talk to a lot of tech companies that say that goal No. 1 is just build a product people love. Period. And then figure out the business model, the privacy policy, everything else from there, but you start with just, "let's make a thing that people like." But you, at the very beginning, have this added constraint of saying, "privacy is value No. 1." How does that change what kind of product you want to build?

I think this is something that has been changing quite a lot in the past decade. More and more today, when people make decisions about what to buy and which companies they want to support, it's really a decision not so much based on the product. You do need to have good products, but more than that, it's really also philosophy and shared values, right? When people buy a Tesla, for example, it is probably objectively in many ways a better car than gas cars. But in 2008, when Tesla was first starting out, that probably wasn't the case. People bought in not so much due to the product, but because of the sense of values and what the company stood for, and having a connection between their own values and the values of the product that they're buying.

I think long term, product is important. You must have a good product, but more important than that, I think businesses need to have an alignment of values with their customers, because it's really about the relationship between you and your users. And that relationship needs to be built on trust, built on transparency and built on a shared set of ideals.

But that doesn't solve for having a worse product, right? It does seem like it's a tiebreaker, in a lot of ways. But if you have a terrible email client, nobody's going to use it.

Yes, yes. In the long run, the product also must be there. But if I look at a lot of the revolutionary companies that have come through in the past couple of decades, some of the most successful ones didn't always win on product. They eventually won on product, but they first entered the space through values: different kinds of values and different ideas of how business should be done.

In the course of building this product, how much are features and privacy at odds with each other? There are all kinds of things you could build that would obviously be delightful user experiences and a total privacy nightmare. And then in reverse, you could build things that are great for privacy and really keep people's stuff safe, but are just awful to use. How do you navigate that tension?

Yes, there is in fact a tension there. But what I've also discovered is that as time goes on, technology gets better. As phones and the internet get faster, you're able to resolve some of these issues. For example, historically there were many things that just couldn't be done on mobile devices, because they were too slow and didn't have enough resources, or the internet wasn't fast enough.

But today, the amount of computing power that people have on an iPhone or an Android phone is enormous, right? So the difference between a mobile and a desktop experience, even from a pure compute standpoint, is narrowing. And this allows us, from a technology standpoint, to do a lot of things that we previously couldn't do client-side.

Give me an example of a feature that falls into that.

Just encrypting data on the client side is one very basic thing, right? Encryption on the client side takes resources, it takes CPU power. And if you took devices from, say, a decade ago, they just wouldn't have the speed to properly handle that encryption client-side; you had to do it server-side. End-to-end encryption today is largely possible because devices have gotten a lot faster.
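The principle Yen describes can be sketched in a few lines. This is a toy illustration of client-side encryption, not Proton's actual implementation: the function names are hypothetical, and the XOR-keystream construction here is for demonstration only, standing in for the vetted primitives (such as OpenPGP/AES) a real client would use. The point is that the key never leaves the client, so the server only ever stores an opaque blob.

```python
import hashlib
import hmac
import secrets

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Derive a pseudorandom keystream from key + nonce via HMAC-SHA256 blocks."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hmac.new(key, nonce + counter.to_bytes(4, "big"), hashlib.sha256).digest()
        counter += 1
    return out[:length]

def client_encrypt(key: bytes, plaintext: bytes) -> bytes:
    """Encrypt on the client before upload; the nonce travels with the ciphertext."""
    nonce = secrets.token_bytes(16)
    ct = bytes(p ^ k for p, k in zip(plaintext, keystream(key, nonce, len(plaintext))))
    return nonce + ct

def client_decrypt(key: bytes, blob: bytes) -> bytes:
    """Decrypt on the client after download; XOR with the same keystream."""
    nonce, ct = blob[:16], blob[16:]
    return bytes(c ^ k for c, k in zip(ct, keystream(key, nonce, len(ct))))

# The key stays on the client; the "server" stores only the opaque blob.
key = secrets.token_bytes(32)
blob = client_encrypt(key, b"meet at noon")
assert client_decrypt(key, blob) == b"meet at noon"
```

The CPU cost lives entirely in `keystream` and the byte-wise XOR, which is exactly the work that older phones struggled to do at scale and modern ones handle trivially.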

And there are more things: A lot of people are using AI on data to make predictive autocompletes and things like that. Today, it's kind of unimaginable to do that client-side. But given the rate at which processing power is increasing, it becomes easier to imagine that in five, six years, doing that on a mobile device is not out of the question.

And that seems like kind of universally a win for privacy. The more you can do client-side, the less you have to worry about. Right?

Yes, yes. Privacy is about decentralizing. You don't want to have one person that is the gatekeeper, that holds the keys and control of all your data. You want to distribute that. You want to reallocate that back to who it belongs to, which is the user.

I was going to talk about this later, but since you just brought it up: I've been talking to a lot of people about the Signal/WhatsApp dustup who say, basically, it's good for privacy that people are leaving a Facebook property and going to Signal. But fundamentally, it's the same proposition, in that you're still trusting someone. There is still a person or company on the other end that you are putting your faith in. And what they all say is that this is why the future is decentralized and blockchain and zero-trust systems. And it sounds like you agree with that. Is that the future of ProtonMail, too? Is ProtonCoin coming someday to fund all of this stuff?

We're not doing any kind of ICO. I don't think that's something that we find to be the optimum way to raise funds.

There's been a lot of talk and discussion about how blockchain can revolutionize all sorts of industries. But my viewpoint on this tends to be a bit more cautious. There is a tendency to try to position blockchain as a solution to everything. But oftentimes, it is not the best solution for many problems. Even though it's new, it's fancy, it's cool and it's popular, that doesn't mean it's the best way. So I caution people to always kind of think that through and see what works and what doesn't work.

Blockchain technology, while it might seem quite mature, in many ways is still quite young. There's still a lot of problems to be solved. Can I picture a day when everything goes completely decentralized? Yeah, I can imagine that happening. But I think it will take a lot longer than people anticipate.

Do you agree that in theory, decentralization is the right next step to keep privacy working? I could see you making a case like, "We're scientists in Switzerland, we're as trustworthy as anybody!" Do you see the idea of "trust nobody" as the obvious next correct answer?

So trust is kind of an interesting topic. At Proton, a lot of what we do, the way that we structure our development process, the fact that everything is open-source and audited, the way that we handle security and key management, introducing features like key pinning, a lot of it is based around the idea of making things as trustless as possible.

But, fundamentally, and this might be controversial, I don't actually believe there is such a thing as a completely trustless system. Because at the end of the day everything, even decentralized software, is built by people. And that is one component that I don't see us removing any time soon. So it doesn't matter what you do, what system you use, you ultimately need to trust somebody. Unless you go in there and read every line of code yourself and verify it yourself. Which is something nice to consider in theory, but in practice not really possible.
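Key pinning, which Yen mentions above as one of Proton's trust-minimizing measures, is simple to sketch. This is a hypothetical illustration of the general technique, not Proton's code: the hostname and the pinned key bytes are made up. The idea is that the client ships with a fingerprint of the server's public key and refuses to proceed if the key it receives does not match, even if a certificate authority vouches for it.

```python
import hashlib

# Hypothetical pin table: hostname -> SHA-256 fingerprint of the expected
# server public key, baked into the client at build time.
PINNED_FINGERPRINTS = {
    "mail.example.com": hashlib.sha256(b"server-public-key-bytes").hexdigest(),
}

def key_matches_pin(host: str, server_public_key: bytes) -> bool:
    """Return True only if the presented key hashes to the pinned fingerprint."""
    fingerprint = hashlib.sha256(server_public_key).hexdigest()
    return PINNED_FINGERPRINTS.get(host) == fingerprint

# The legitimate key passes; a substituted key fails, even with a valid cert.
assert key_matches_pin("mail.example.com", b"server-public-key-bytes")
assert not key_matches_pin("mail.example.com", b"attacker-key-bytes")
```

Note how this still requires trust: someone had to compute and ship that pin table, which is exactly Yen's point that no system is completely trustless.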

That's actually a good segue into the other thing I want to talk about, which is this broader privacy debate, because I think the thing that you just described is very much not a part of how we talk about privacy regulation. Especially the question of, do you make a backdoor to let the good guys in so that you can regulate what people are talking about and properly moderate your platforms, which obviously a lot of people want, versus giving people true privacy, which has downsides. And you have a very obvious point of view on this, which is you're pro-privacy, but can you walk me through what is happening in this fight right now?

It's an extremely complicated issue. It's very tough to wrap your mind around all this … and politicians certainly haven't managed to do it. Maybe the best way to discuss this topic is just to share my views of how I look at it. Not everybody agrees with me.

But we basically have a choice, right? You can either live in a world with privacy, or you can live in a world without privacy. There is no doubt in my mind that the world without privacy will be safer. There'll be no crime, because everybody is tracked. There'll be no terrorism, because you know who the terrorist is, even before they commit the terrorist act. And you will have probably absolute safety, whatever that means.

So the question we have to ask is, what is the world we want to live in? Do we want to live in a world without privacy, where we have some sort of safety? Or do we want to live in a world where we have freedom of thought, freedom of expression, free speech and all the other things that come with ensuring privacy? And by and large, if you think about it from just that perspective, most people would agree that they want to live in a world that does have privacy.

But I think most people instinctively believe there has to be a middle ground there somewhere, right? Where I can have the best of both worlds? And it seems like what we're increasingly finding is no, these two things are as mutually exclusive as you're describing.

It's natural for us to try to find a middle ground. And that is the political struggle right now: to say, what is the middle ground?

The way I look at it is you have to look concretely at the proposals. The EU recently came out with a proposal. They avoided the word "backdoor," but what they're essentially asking for is some sort of backdoor. And any technical person who looks at that will tell you that these ideas are unworkable. They either completely compromise the security and the privacy of the tools, or they rely on technology that doesn't yet exist. So I'm not saying that a middle ground cannot someday be found. We don't need to always be living on the two extremes. But we need to also carefully assess these proposals on their technical merits, and see if they make sense. And the proposal that we're fighting against is one that simply doesn't make sense.

Privacy companies like us, we're not the enemy, right? We are actually highly incentivized to crack down on abuse and crack down on criminal misuse of our services. If a terrorist is going to use my service, they're certainly not going to leave their credit card, they're not going to leave their address. That is a very poor-performing business segment, right? Our interests are actually aligned with governments' interest in ensuring public safety.

On the flip side, if we decide to throw in on the side of privacy and free speech and free thought, like you're talking about, there does seem to be some acceptance of bad things that just has to happen. If you want to preserve privacy, you are preserving privacy for bad guys. Right? I think you're right that that is probably a smaller portion of the whole than people like to talk about, but it's there. And it's real.

Yeah, of course. And the way to look at this: We've seen that terrorists use airplanes, but airplanes are also very, very important in connecting the world and making modern society function. So we need to look at what is the overall social good, right? We tolerate airplanes, despite their occasional use by terrorists, because there is an overall benefit to society from airplanes existing.

Encryption is the same thing: There is an overall benefit to society that outweighs the occasional risk. And, of course, we can take measures to try to prevent bad people from misusing technology. But just like you wouldn't build an airplane that is less safe in order to prevent terrorists from using it, I wouldn't want to build encryption that is less safe in order to prevent the bad guys from using it.

To totally stretch and possibly ruin this metaphor, the solution we came up with for that was TSA and security screenings. What is the encryption version of that? I think what the government argues is that the encryption version of that is a backdoor for law enforcement, where you let somebody pay attention even though most people can't pay attention. Is there a better answer to that question than what they're proposing?

We want to avoid a situation where you have mass surveillance, where they say "we want to be able to break into everybody's information, and be able to see everything, just in case you're guilty." I am not opposed to law enforcement being able to go after people, but it needs to be done in a targeted way that doesn't put the general population at risk. And this is a balance that actually can be struck.

Today, we can get a court order to begin recording IP addresses and turn over logs from specific users. And that is a targeted measure that requires court approval, and that's OK. And so I think the balance can be struck. We want to have police, but not a police state.

What do you make of what Apple is doing, with the privacy labels and things? We're in this interesting position, where absent these regulations, these big companies get to dictate how privacy works, at least for a while. And Apple seems to be the one pushing the hardest. Do you feel like that's a good sign that Apple is betting on privacy this way?

Well, Apple's a very interesting case. They are promoting privacy. But are they promoting privacy because they believe in privacy? Or are they promoting privacy as a way to lock out other players and strengthen their own monopoly?

What a great question!

And in Apple's case, if you look historically at the positions they take, it's pretty clear to me that they care more about their revenues than about users themselves. I'll give you a very basic example: Apple is actually the only big tech company that does business in China. You can say everything you want about Facebook and Google and their questionable business practices. But even Facebook and Google drew the line at engaging in China, because it was ethically and morally something that they couldn't tolerate. They didn't want to be complicit in the actions of the Chinese government. But Apple didn't have a problem with that. They saw the revenue in that.

Apple does have a very strong privacy brand, but if you were to map out their actions and what they're willing and not willing to do, I would say that they clearly put revenue and business interests ahead of the interests of users. And that worries me, especially as I see them trying to consolidate more power and strengthen their monopoly.

The way that you think about all this makes me think that if you wanted to, you could get into building almost anything. Do you have dreams of a Proton smartphone or a Proton browser? If you start with the idea that with privacy, you can build something both better and different, that applies to everything in tech right now, right?

In order to really guarantee privacy, you need to be in more areas than just email, VPN, calendar and file storage, right? The entire ecosystem of applications and services that exists today could be rebuilt and reimagined in a privacy-focused way.

And in the long term, given that our mission is to provide privacy and security to everybody who wants it, if we continue to have success, we will inevitably need to go into other sectors. And that's probably the end game. But I'm also excited to see that the ecosystem around privacy is growing. You have companies like DuckDuckGo, which is very successful in search. You have companies like Brave, which is now active in the browser space. Of course, they're still based on ads, so I don't fully agree with that model, but they are at least doing something better than Chrome. And you also see Signal in chat.

I don't think it's possible for one company to do it all. And we certainly wouldn't aspire to do that. But I do think that a parallel internet built on different values is starting to coalesce and develop, and there will be many players jumping into this space in the next five to 10 years. So that 10 years from now, it would actually be possible to have a completely private internet existence.

