People

One man’s plan to build a new internet

Dfinity Chief Scientist Dominic Williams comes on the Source Code Podcast.

Dfinity's founder and chief scientist, Dominic Williams.

Photo: Dfinity

Much is wrong with the internet we have now. But what does better look like?

Dominic Williams, the founder and chief scientist at Dfinity, thinks he has an answer. It's called the Internet Computer, and it builds on top of the internet's most basic protocols to create a new generation of the web that doesn't exist on a bunch of private networks controlled by tech giants, but is run by the network itself. It's zero-trust and unhackable and yeah, you guessed it, it's blockchain. But blockchain that works "at web speed," Williams said.

Williams' idea isn't the only one about what it would take to reinvent the internet, but it's a serious player: Dfinity, a nonprofit, has raised $195 million to build what Williams hopes will be a hack-proof, monopoly-proof, totally free and open internet. He came on the Source Code Podcast to explain his vision, how it might work and what it'll take to take on the tech giants.



Below are excerpts from our conversation, lightly edited for length and clarity.

I want to start by kind of defining the problem of the internet now. We talk a lot in this industry about the way the internet is broken, and it's about moderation and business models and identity and data collection and Facebook ruining democracy and all sorts of things. As you think about what needs to be solved about the way the internet works now, how do you think about it?

So the internet itself is one of mankind's greatest inventions and achievements. It's a public network, created by an open, decentralized protocol that combines millions of private networks to create this meta-network that connects everybody and everything. And the internet does a number of wonderful things. It's unstoppable: It was designed to withstand a nuclear strike, and that's a very valuable property.

Because it was a decentralized protocol, many independent parties were able to build out at scale rapidly. If you remember the 1990s, there was this sort of Cambrian explosion of internet service providers and backbone providers and so on. And what's most important is that it creates an open, permissionless environment. So let's say you and I create two competing websites. I can't pick up the phone to the owner of the internet and say, "Hey, if you slow down David's website, I'll give you some stock in my company," right? It created this amazing global free market, which has provided a very firm foundation for an enormous amount of innovation and economic growth.

However, the internet thus far is only really a network. If you want to build a service and connect it to the internet, that service itself must be built entirely on a proprietary stack. So today, people wanting to build services or enterprise systems will get an account with a cloud services provider like Amazon Web Services, and they'll install a whole load of traditional software building blocks on the instances or platform they've rented: databases, web servers and so on. That's just how it's done today.

Our thesis with the Internet Computer is that, in various subtle ways, this has led to the internet becoming very monopolistic and fragile. And our solution is to extend the internet. So while today the internet provides a public network that connects everybody and everything, tomorrow it's also going to be the platform that people build on.

In spirit, what you're describing is not necessarily all that different from what the internet was supposed to be when we were talking about it 25 or 30 years ago, right? I feel like everybody in the open source community is listening to this and sort of nodding furiously, like, "this is what I've been talking about for three decades!" Right?

Absolutely. The current internet ecosystem is antithetical to the ethos of the internet. And we want to enable the world to reinvent the internet ecosystem, reimagine it in a better way, through converting the internet into something that's more than just a network. Extending it so that as well as a network, the internet is a compute platform, and that people can build anything from a website and an enterprise system through to an internet service and DeFi just by writing code to the internet, where it's hosted within the protocol along with the data it processes. That enables people to build systems that are unstoppable and tamper-proof.

With that kind of high-level thinking in mind, give me the 5-year-old level version of what Dfinity is trying to build, and how the internet computer works.

Unfortunately, there's no really easy explanation because there's a lot of advanced computer science.

It's a series of tubes.

Yeah! But OK, highest level: So the internet itself is created by a protocol called IP. And this protocol is able to combine millions of privately operated networks to create this single public meta-network. That simplifies everything. My software that's recording sound and sending it to you only needs to know the IP address of the computer you're using. That's it, it's just like a telephone number.
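
To make the "just a telephone number" point concrete, here is a minimal sketch using Python's standard socket module: sending a datagram to another machine on the internet requires nothing more than its IP address, plus a port for the receiving program. The address 203.0.113.7 is a reserved documentation address and the port 9000 is a placeholder; neither comes from the conversation.

```python
# Minimal illustration: addressing a host on the public internet needs only an
# IP address and a port. 203.0.113.7 is a reserved documentation address used
# here purely as a placeholder destination.
import socket

DESTINATION = ("203.0.113.7", 9000)  # placeholder IP and port

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)  # UDP datagram socket
sock.sendto(b"hello over IP", DESTINATION)               # only the address is needed
sock.close()
```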

So the Internet Computer is created by a protocol called ICP: Internet Computer Protocol. And ICP runs over the top of IP. It combines the compute capacity of special node machines that are run en masse by independent data centers around the world, and it combines that compute capacity to create a single public compute platform, which is seamless, and can scale out and has unbounded capacity.

It's a completely novel kind of compute platform. It even reimagines how software works in various significant ways: The platform is unstoppable, it's tamper-proof, and you don't need to protect things you build on the Internet Computer with a firewall. And you don't need to use any of the traditional legacy building blocks. You don't need a cloud service, you don't need a content distribution network, you don't need a database, you don't need a web server, you don't need memcached. You literally just write your code to the internet.
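
To illustrate what that developer model implies, here is a deliberately toy Python sketch. It is not the Internet Computer SDK (real canisters are written in languages such as Motoko or Rust), and the ProfileService class and its methods are invented for this example. The only point it makes is that application state lives with the code itself rather than in a separately provisioned database, web server or cache.

```python
# A toy "canister-style" service: its state is just program data that persists
# with the code that owns it. Purely conceptual; not the Internet Computer SDK.

class ProfileService:
    def __init__(self) -> None:
        self.profiles: dict[str, str] = {}   # state held by the service itself

    def set_bio(self, user: str, bio: str) -> None:
        self.profiles[user] = bio            # an update call mutates that state

    def get_bio(self, user: str) -> str:
        return self.profiles.get(user, "")   # a query call reads it back


if __name__ == "__main__":
    svc = ProfileService()
    svc.set_bio("alice", "Building on the open internet")
    print(svc.get_bio("alice"))
```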

It seems like, structurally, the simplest thing that it provides is to get all of that control that you're talking about out of the hands of a few companies and into the hands of the ecosystem in a broad way. Is that one of the organizing principles here?

It's true that the Internet Computer can be used to build open alternatives to Big Tech services, alternatives that can out-compete them because of various advantages open services have. But the Internet Computer was conceived simply because it's technically possible, and because it's a superior way for humanity to build its compute infrastructure.

We have 7.8 billion people on this planet, and we can only sustain the lives of that many people through automation. Modern society and the modern world depend upon heavy computerization to exist. If you look at supermarkets, well, you've got zero-day inventory, there's a complex supply chain that moves produce almost directly from farms with minimal waypoints into the supermarkets, and so on. And generally now, families depend upon internet services for communication. When President Trump talked about banning WeChat, Chinese expats in America were terrified that they'd lose contact with people back home. This stuff just has to be unstoppable.

The internet itself was designed to be unstoppable because of the Cold War. And yet here we are 70 years later, and the network is robust, but the services we're connecting to it aren't. And they become more and more fragile not only because of the way they're built, but because of their concentration in the hands of a few big tech mega-monopolies.

So then why isn't your argument to nationalize the internet? Why aren't you advocating for my tax dollars to pay for data centers the same way that they pay for roads?

Well, it's not just about data centers. The problems go really deep. It's about the entire stack, and what the stack can and can't do. I don't believe hyperscale data centers are the way to go, I think it's really about pushing computation to the edge.

But the way it enables you to build internet services in open form is what really provides a solution to the problems of Big Tech and mega-monopolies, because it changes the incentives and creates new ways to win by designing systems in a more open way.

We first saw it on the Bitcoin ledger: There are these little access control scripts. Bitcoin was the first stateful decentralized network. And when I saw Bitcoin, I had a kind of epiphany. The Bitcoin ledger doesn't reside anywhere in particular; it just lives in cyberspace. The whole world can agree on this ledger, and it's tamper-proof, there's no way of hacking it. Otherwise, obviously, somebody would do that, because they could transfer billions of dollars' worth of bitcoin to themselves.

Now, the Bitcoin ledger has three columns, if you like; it's like a spreadsheet with three columns. The first column is the address. The second column is the balance of bitcoin at the address. And the third column is an access control script, which you need to unlock to move the bitcoin. So where does that code live? Who's responsible for it? The answer is, it's really autonomous. It just lives in cyberspace.
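
To make the three-column picture concrete, here is a minimal Python sketch of that "spreadsheet" view. It is only an illustration of the idea: real Bitcoin tracks unspent transaction outputs and expresses its access control in a dedicated Script language, and the ledger dictionary, the transfer function and the sample addresses below are all invented for the example.

```python
# Toy three-column ledger: address -> (balance, access-control check).
# Illustrative only; real Bitcoin uses UTXOs and its own Script language.
from typing import Callable

Ledger = dict[str, tuple[float, Callable[[str], bool]]]

ledger: Ledger = {
    "addr_alice": (5.0, lambda secret: secret == "alice-signature"),
    "addr_bob":   (1.0, lambda secret: secret == "bob-signature"),
}

def transfer(book: Ledger, src: str, dst: str, amount: float, secret: str) -> None:
    balance, unlock = book[src]
    if not unlock(secret):                 # column three: the access control script
        raise PermissionError("script did not unlock")
    if amount > balance:
        raise ValueError("insufficient funds")
    dst_balance, dst_unlock = book[dst]
    book[src] = (balance - amount, unlock)
    book[dst] = (dst_balance + amount, dst_unlock)

transfer(ledger, "addr_alice", "addr_bob", 2.0, "alice-signature")
print(ledger["addr_alice"][0], ledger["addr_bob"][0])  # 3.0 3.0
```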

And Ethereum obviously took that a step further: They took that ledger, and they swapped the order of the last two columns. So now you had address, script — which is now a smart contract, and it's Turing complete, so you can create vastly more things with it — and then balance of coins. And actually, the coins move between the scripts on Ethereum. So the Internet Computer, of course, is an evolution of blockchain. It's the world's first unbounded blockchain computer that can run at web speed and doesn't have capacity limitations, so it can scale out its compute capacity as needed. Which means that you can rethink how you build everything. You can essentially build on cyberspace.
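
A similarly hedged sketch of the reordering Williams describes: the script becomes a general-purpose program that itself holds a balance, and coins move between programs when they are called. Again, this is a cartoon of the account model rather than Ethereum's actual EVM and gas machinery; the Contract class and the sample contracts are made up for illustration.

```python
# Toy account model: each entry holds Turing-complete code plus a coin balance,
# and coins move between contracts as part of a call. Not real EVM semantics.
from typing import Callable

class Contract:
    def __init__(self, code: Callable[["Contract", str], str], balance: float = 0.0):
        self.code = code          # the script, now an arbitrary program
        self.balance = balance    # coins held by the contract itself

    def call(self, caller: "Contract", message: str, value: float = 0.0) -> str:
        caller.balance -= value   # coins move between scripts with the call
        self.balance += value
        return self.code(self, message)

wallet = Contract(code=lambda self, msg: msg, balance=10.0)
greeter = Contract(code=lambda self, msg: f"contract now holds {self.balance} coins")

print(greeter.call(wallet, "hello", value=2.5))  # "contract now holds 2.5 coins"
```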

The thing that I keep coming back to is that we are in this place where tearing down the internet and rebuilding it just seems impossible! There are so many entrenched players that getting in and trying to reinvent it from the inside seems just as impossible. From your perspective, what does it look like to figure out what steps one and two of this process are supposed to be?

I think any major undertaking can feel impossible until you've succeeded. And I think that's the case here.

My view is that DeFi will replace traditional finance, because it has certain fundamental advantages. It's just difficult to see and comprehend that now because you look at this heavily entrenched industry that's protected by regulators. So, staring at the mountain, it seems inconceivable that it can be climbed, but climbed it will be.

Legacy, proprietary, closed Big Tech infrastructure and services won't just disappear. We're still running COBOL, right? So it's not just going to disappear. It doesn't work like that. But I think five years from now, there's going to be a huge amount of excitement, people are going to see mass-market, open internet services that are beginning to edge out big tech services in various areas. Ten years from now, it'll be sort of widely seen which way the wind is blowing. Twenty years from now, the open internet will be far, far bigger than the internet we have today.
