Proton’s plan for a more private internet

Andy Yen joins the Source Code podcast to talk about building privacy-focused apps, the encryption backdoor debate and whether Apple is actually fighting for the user.


ProtonMail's encrypted email was just the beginning for Proton Technologies.

Photo: ProtonMail

The team behind ProtonMail didn't necessarily set out to build an email client. They started out with a big idea: that privacy online is worth preserving, and that it's possible to build great products that preserve that privacy. They started with email, CEO Andy Yen said, because in the 21st century "your email address is your online identity." Since then, the Proton team has built a VPN, a calendar, a file storage system, and has much more on the roadmap.

Yen joined the Source Code Podcast to talk about Proton's story, what it takes to build products with privacy as a first-class feature and why that's getting easier to do all the time. He also talked about the debate over encryption backdoors for law enforcement, whether Apple's push for privacy is really about doing right by users and the other privacy-focused products the internet needs most.

Subscribe to the show: Apple Podcasts | Spotify | Google Podcasts | RSS

The following excerpts from our interview have been edited for length and clarity.

In the early days of ProtonMail, how does thinking about privacy change the way that you start and build both the company and the product? I talk to a lot of tech companies that say that goal No. 1 is just build a product people love. Period. And then figure out the business model, the privacy policy, everything else from there, but you start with just, "let's make a thing that people like." But you, at the very beginning, have this added constraint of saying, "privacy is value No. 1." How does that change what kind of product you want to build?

I think this is something that has been changing quite a lot in the past decade. More and more today, when people make decisions about what to buy and what companies they want to support, it's really a decision not so much based on a product. You do need to have good products, but more than that, it's really also philosophy and shared values, right? When people buy a Tesla, for example, it is probably objectively in many ways a better car than gas cars. But in 2008, when Tesla was first starting out, that probably wasn't the case. People bought in not so much due to the product, but because of the sense of values and what the company stood for, and having a connection between their own values and the values of the product that they're buying.

I think long term, product is important. You must have a good product, but more important than that, I think businesses need to have an alignment of values with their customers, because it's really about the relationship between you and your users. And that relationship needs to be built on trust, built on transparency and built on a shared set of ideals.

But that doesn't solve for having a worse product, right? It does seem like it's a tiebreaker, in a lot of ways. But if you have a terrible email client, nobody's going to use it.

Yes, yes. In the long run, product also must be there. But if I look at a lot of the revolutionary companies that have come through in the past couple of decades, some of the most successful ones didn't really always win on product. They eventually won on product. But they first entered the space through values, and different kinds of values and different, you know, ideas of how business should be done.

In the course of building this product, how much are features and privacy at odds with each other? There are all kinds of things you could build that would obviously be delightful user experiences and a total privacy nightmare. And then in reverse, you could build things that are great for privacy and really keep people's stuff safe, but are just awful to use. How do you navigate that tension?

Yes, there is in fact a tension there. But what I've also discovered is as time goes on, technology gets better. As phones and the internet get faster, you're able to, you know, resolve some of these issues. So for example, historically, there were many things that just couldn't be done on mobile devices, because they were too slow, and didn't have enough resources. Or the internet wasn't fast enough.

But today, the amount of computing power that people have on an iPhone or an Android phone is enormous, right? So the difference between a mobile and a desktop experience, even from a pure compute standpoint, is narrowing. And this allows us, from a technology standpoint, to do a lot of things that we previously couldn't do client-side.

Give me an example of a feature that falls into that.

Just encrypting data on the client side is one very basic thing, right? You know, encryption on the client side takes resources, it takes CPU power. And if you took devices from, say, a decade ago, they just wouldn't have the speed to properly handle that encryption client-side; you had to do it server-side. End-to-end encryption today is largely possible because devices have gotten a lot faster.
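To make the client-side idea concrete, here is a minimal sketch of encrypting data on the user's own device with a key the server never sees, so only an opaque blob is ever uploaded. It is an illustrative example in Python using AES-GCM, not Proton's actual implementation (ProtonMail's end-to-end encryption is built on OpenPGP).

```python
# Minimal sketch of client-side encryption -- illustrative only,
# not ProtonMail's OpenPGP-based implementation.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def client_side_encrypt(plaintext: bytes, key: bytes) -> bytes:
    """Encrypt locally; only the ciphertext (plus nonce) is sent to the server."""
    nonce = os.urandom(12)                 # 96-bit nonce, unique per message
    ciphertext = AESGCM(key).encrypt(nonce, plaintext, None)
    return nonce + ciphertext              # the server stores this opaque blob

def client_side_decrypt(blob: bytes, key: bytes) -> bytes:
    """Decrypt locally; the server never sees the key or the plaintext."""
    nonce, ciphertext = blob[:12], blob[12:]
    return AESGCM(key).decrypt(nonce, ciphertext, None)

key = AESGCM.generate_key(bit_length=256)  # generated and kept on the device
blob = client_side_encrypt(b"Meet at 10?", key)
assert client_side_decrypt(blob, key) == b"Meet at 10?"
```

The work Yen describes (the AES operations here, or the public-key operations in OpenPGP) is exactly what older phones struggled to do quickly enough, which is why it used to be pushed server-side.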

And there are more things: A lot of people are using AI on data to make predictive autocompletes and things like that. Today, it's kind of unimaginable to do that client-side. But given the rate at which processing power is increasing, it becomes easier to imagine that in five, six years, maybe doing that on a mobile device is not out of the question.
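As a rough illustration of the kind of client-side prediction Yen is describing, here is a toy next-word suggester that learns and predicts entirely on the device, so none of the user's text is sent to a server. It is a deliberately simple stand-in for the on-device models he speculates about, not anything Proton has announced.

```python
# Toy on-device autocomplete: everything runs locally, nothing leaves the device.
from collections import Counter, defaultdict

class LocalAutocomplete:
    """Toy next-word prediction built only from the user's own text."""

    def __init__(self):
        # Maps a word to frequency counts of the words that follow it.
        self.next_words = defaultdict(Counter)

    def learn(self, text):
        """Update the local bigram counts from text typed on this device."""
        words = text.lower().split()
        for prev, nxt in zip(words, words[1:]):
            self.next_words[prev][nxt] += 1

    def suggest(self, prev_word, k=3):
        """Return the k most frequent follow-up words, computed locally."""
        return [w for w, _ in self.next_words[prev_word.lower()].most_common(k)]

model = LocalAutocomplete()
model.learn("see you tomorrow")
model.learn("see you soon")
model.learn("see you tomorrow then")
print(model.suggest("you"))   # ['tomorrow', 'soon']
```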

And that seems like kind of universally a win for privacy. The more you can do client-side, the less you have to worry about. Right?

Yes, yes. Privacy is about decentralizing. You don't want to have one person that is the gatekeeper, that holds the keys and control of all your data. You want to distribute that. You want to reallocate that back to who it belongs to, which is the user.

I was going to talk about this later, but since you just brought it up: I've been talking to a lot of people about the Signal/WhatsApp dustup who say, basically, it's good for privacy that people are leaving a Facebook property and going to Signal. But fundamentally, it's the same proposition, in that you're still trusting someone. There is still a person or company on the other end that you are putting your faith in. And what they all say is that this is why the future is decentralized and blockchain and zero-trust systems. And it sounds like you agree with that. Is that the future of ProtonMail, too? Is ProtonCoin coming someday to fund all of this stuff?

We're not doing any kind of ICO. I don't think that's something that we find to be the optimum way to raise funds.

There's been a lot of talk and discussion about how blockchain can revolutionize all sorts of industries. But my viewpoint on this tends to be a bit more cautious. There is a tendency to try to position blockchain as a solution to everything. But oftentimes, it is not the best solution for many problems. Even though it's new, it's fancy, it's cool and it's popular, that doesn't mean it's the best way. So I caution people to always kind of think that through and see what works and what doesn't work.

Blockchain technology, while it might seem quite mature, in many ways is still quite young. There's still a lot of problems to be solved. Can I picture a day when everything goes completely decentralized? Yeah, I can imagine that happening. But I think it will take a lot longer than people anticipate.

Do you agree that in theory, decentralization is the right next step to keep privacy working? I could see you making a case like, "We're scientists in Switzerland, we're as trustworthy as anybody!" Do you see the idea of "trust nobody" as the obvious next correct answer?

So trust is kind of an interesting topic. At Proton, a lot of what we do, the way that we structure our development process, the fact that everything is open-source and audited, the way that we handle security and key management, introducing features like key pinning, a lot of it is based around the idea of making things as trustless as possible.
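Key pinning, which Yen mentions here, roughly means remembering a trusted fingerprint of a contact's public key and refusing to encrypt to any key that doesn't match it, so a server (or an attacker in the middle) can't silently swap keys. The sketch below shows the general idea with a SHA-256 fingerprint check; it illustrates the concept, and is not Proton's actual contact key-pinning code.

```python
# Sketch of the key-pinning idea: trust the fingerprint the user verified,
# not whatever key comes back from the server. Illustrative only.
import hashlib

def fingerprint(public_key_bytes: bytes) -> str:
    """SHA-256 fingerprint of a serialized public key."""
    return hashlib.sha256(public_key_bytes).hexdigest()

# When the user first verifies Alice, her key's fingerprint is pinned locally.
alice_key_first_seen = b"-----BEGIN PGP PUBLIC KEY BLOCK----- ...placeholder..."
pinned = {"alice@example.com": fingerprint(alice_key_first_seen)}

def verify_contact_key(email: str, fetched_key: bytes) -> bool:
    """Encrypt to a freshly fetched key only if it matches the pinned fingerprint."""
    expected = pinned.get(email)
    return expected is not None and fingerprint(fetched_key) == expected

# The same key passes; a swapped or tampered key is rejected.
assert verify_contact_key("alice@example.com", alice_key_first_seen)
assert not verify_contact_key("alice@example.com", b"attacker-substituted key")
```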

But, fundamentally — and this might be controversial — I don't actually believe there is such a thing as a completely trustless system. Because at the end of the day everything, even decentralized software, is built by people. And that is one component that I don't see us removing any time soon. So it doesn't matter what you do, what system you use, you ultimately need to trust somebody. Unless you go in there and read every line of code yourself and verify it yourself. Which is something nice to consider in theory, but in practice, not very possible.

That's actually a good segue into the other thing I want to talk about, which is this broader privacy debate, because I think the thing that you just described is very much not a part of how we talk about privacy regulation. Especially the question of, do you make a backdoor to let the good guys in so that you can regulate what people are talking about and properly moderate your platforms, which obviously a lot of people want, versus giving people true privacy, which has downsides. And you have a very obvious point of view on this, which is you're pro-privacy, but can you walk me through what is happening in this fight right now?

It's an extremely complicated issue. It's very tough to wrap your mind around all this … and politicians certainly haven't managed to do it. Maybe the best way to discuss this topic is just to share my views of how I look at it. Not everybody agrees with me.

But we basically have a choice, right? You can either live in a world with privacy, or you can live in a world without privacy. There is no doubt in my mind that the world without privacy will be safer. There'll be no crime, because everybody is tracked. There'll be no terrorism, because you know who the terrorist is, even before they commit the terrorist act. And you will have probably absolute safety, whatever that means.

So the question we have to ask is, what is the world we want to live in? Do you want to live in a world without privacy, where we have some sort of safety? Or do we want to live in a world where we have freedom of thought, freedom of expression, free speech and all the other things that come with ensuring privacy? And by and large, if you think about it from just that perspective, most people would agree that they want to live in a world that does have privacy.

But I think most people instinctively believe there has to be a middle ground there somewhere, right? Where I can have the best of both worlds? And it seems like what we're increasingly finding is no, these two things are as mutually exclusive as you're describing.

It's natural for us to try to find a middle ground. And that is the political struggle right now: to say, what is the middle ground?

The way I look at it is you have to look concretely at the proposals. The EU recently came out with a message. They avoided the word "backdoor," but what they're essentially asking for is some sort of backdoor. And any technical person that looks at that will tell you that these ideas are unworkable. They either completely compromise the security and the privacy of the tools, or they are relying on technology that doesn't yet exist. So I'm not saying that a middle ground cannot someday be found. We don't need to always be living on the two extremes. But we need to also carefully assess these proposals on their technical merits, and see if they make sense. And the proposal that we're fighting against is one that simply doesn't make sense.

Privacy companies like us, we're not the enemy, right? We are actually highly incentivized to crack down on abuse and crack down on criminal misuse of our services. If a terrorist is going to use my service, they're certainly not going to leave their credit card, they're not going to leave their address. That is a very poor-performing business segment, right? Our interests are actually aligned with governments' interest in ensuring public safety.

On the flip side, if we decide to throw in on the side of privacy and free speech and free thought, like you're talking about, there does seem to be some acceptance of bad things that just has to happen. If you want to preserve privacy, you are preserving privacy for bad guys. Right? I think you're right that that is probably a smaller portion of the whole than people like to talk about, but it's there. And it's real.

Yeah, of course. And the way to look at this: we've seen that terrorists use airplanes, but airplanes are also very, very important in connecting the world and making modern society function. So we need to look at what is the overall social good, right? We tolerate airplanes, despite their occasional use by terrorists, because there is an overall benefit to society from airplanes existing.

Encryption is the same thing: There is an overall benefit to society that outweighs the occasional risk. And, of course, we can take measures to try to prevent bad people from misusing technology. But just like you wouldn't build an airplane that is less safe in order to prevent terrorists from using it, I wouldn't want to build encryption that is less safe in order to prevent the bad guys from using it.

To totally stretch and possibly ruin this metaphor, the solution we came up with for that was TSA and security screenings. What is the encryption version of that? I think what the government argues is that the encryption version of that is a backdoor for law enforcement, where you let somebody pay attention even though most people can't pay attention. Is there a better answer to that question than what they're proposing?

We want to avoid a situation where you have mass surveillance, where they say "we want to be able to break into everybody's information, and be able to see everything, just in case you're guilty." I am not opposed to law enforcement being able to go after people, but it needs to be done in a targeted way that doesn't put the general population at risk. And this is a balance that actually can be struck.

Today, we can get a court order to begin recording IP addresses and turn over logs from specific users. And that is a targeted measure that requires court approval, and that's OK. And so I think the balance can be struck. We want to have police, but not a police state.

What do you make of what Apple is doing, with the privacy labels and things? We're in this interesting position, where absent these regulations, these big companies get to dictate how privacy works, at least for a while. And Apple seems to be the one pushing the hardest. Do you feel like that's a good sign that Apple is betting on privacy this way?

Well, Apple's a very interesting case. They are promoting privacy. But are they promoting privacy because they believe in privacy? Or are they promoting privacy as a way to lock out other players and strengthen their own monopoly?

What a great question!

And in Apple's case, if you look historically at the positions they take, it's pretty clear to me that they care more about their revenues than about users themselves. I'll give you a very basic example: Apple actually is the only big tech company that does business in China. You can say everything you want to say about Facebook and Google and their questionable business practices. But even Facebook and Google draw the line at engaging in China, because it was ethically and morally something that they couldn't tolerate. They didn't want to be complicit with the actions of the Chinese government. But Apple didn't have a problem with that. They saw the revenue in that.

Apple does have a very strong privacy brand, but if you were to map out kind of their actions and what they're willing and not willing to do, I would say that they clearly put revenue and business interests ahead of maybe the interests of people. And that's, you know, that worries me, especially as I see them trying to consolidate more power and strengthen their monopoly.

The way that you think about all this makes me think that if you wanted to, you could get into building almost anything. Do you have dreams of a Proton smartphone or a Proton browser? If you start with the idea that with privacy, you can build something both better and different, that applies to everything in tech right now, right?

In order to really guarantee privacy, you need to be in more areas than just email, VPN, calendar and file storage, right? The entire ecosystem of applications and services that exists today could all be rebuilt and reimagined in a privacy-focused way.

And in the long term, given that our mission is to provide, you know, privacy and security to everybody that wants it, if we continue to have success, we will inevitably need to go into other sectors. And that's probably the end game. But I'm also excited to see that the ecosystem around privacy is growing. Companies like DuckDuckGo, which is very successful in search. You have companies like Brave, which is now active in the browser space — of course, they're still based on ads, so I don't fully agree with that model, but they are at least doing something better than Chrome. And you also see Signal in chat.

I don't think it's possible for one company to do it all. And we certainly wouldn't aspire to do that. But I do think that a parallel internet built on different values is starting to coalesce and develop, and there will be many players jumping into this space in the next five to 10 years, so that 10 years from now, it will actually be possible to have a completely private internet existence.
