
Mark Zuckerberg’s metaverse will require computing tech no one knows how to build

To achieve anything close to what metaverse boosters promise, experts believe nearly every kind of chip will have to be an order of magnitude more powerful than it is today.

Illustration: Christopher T. Fong/Protocol

The technology necessary to power the metaverse doesn’t exist.

It will not exist next year. It will not exist in 2026. The technology might not exist in 2032, though it’s likely we will have a few ideas as to how we might eventually design and manufacture chips that could turn Mark Zuckerberg’s fever dreams into reality by then.

Over the past six months, a disconnect has formed between the way corporate America is talking about the dawning concept of the metaverse and its plausibility, based on the nature of the computing power that will be necessary to achieve it. To get there will require immense innovation, similar to the multi-decade effort to shrink personal computers to the size of an iPhone.

Microsoft hyped its $68.7 billion bid for Activision Blizzard last month as a metaverse play. In October, Facebook transformed its entire corporate identity to revolve around the metaverse. Last year, Disney even promised to build its own version of the metaverse to “allow storytelling without boundaries.”

These ideas hinge on our ability to build the chips, data centers and networking equipment needed to deliver the computing horsepower required. And at the moment, we can’t. No one knows how, or where to start, or even whether the devices will still be semiconductors. There aren’t enough chips right now to build all the things people want today, let alone what’s promised by metaverse preachers.

“The biggest things that we are looking at in supercomputers today still need to be improved in order to be able to deliver [a metaverse] type of experience,” Jerry Heinz, the former head of Nvidia’s Enterprise Cloud unit, told Protocol.

Zuckerversed

What we now describe as the metaverse is at least as old as early 20th century speculative fiction.

E.M. Forster’s 1909 story “The Machine Stops,” for example, renders a pre-chip, pre-digital version of the metaverse. Fast forward 70 years, and science-fiction writer William Gibson called this concept “cyberspace” in the 1984 book “Neuromancer”; Neal Stephenson popularized the word “metaverse” in his 1992 novel “Snow Crash”; Ernest Cline called it OASIS (an acronym for Ontologically Anthropocentric Sensory Immersive Simulation) in “Ready Player One.” Few of those stories describe a utopian community.

It’s possible that what we now call the metaverse will forever remain the domain of science fiction. But like it or not, Mark Zuckerberg has vaulted the idea into the mainstream.

Zuckerberg’s explanation of what the metaverse will ultimately look like is vague, but includes some of the tropes its boosters roughly agree on: He called it “[an] embodied internet that you’re inside of rather than just looking at” that would offer everything you can already do online and “some things that don’t make sense on the internet today, like dancing.”

If the metaverse sounds vague, that’s because it is. That description could mutate over time to apply to lots of things that might eventually happen in technology. And arguably, an early form of something like the metaverse already exists in the worlds produced by video game companies.

Roblox and Epic Games’ Fortnite play host to millions — albeit in virtually separated groups of a few hundred people — viewing live concerts online. Microsoft Flight Simulator has created a 2.5 petabyte virtual replica of the world that is updated in real time with flight and weather data.

But even today’s most complex metaverse-like video games require a tiny fraction of the processing and networking performance we would need to achieve the vision of a persistent world accessed by billions of people, all at once, across multiple devices, screen formats and in virtual or augmented reality.

“For something that is a true mass market, spend-many-hours-a-day doing [kind of activity, we’re looking] at generations of compute to leap forward to do that,” Creative Strategies CEO Ben Bajarin told Protocol. “What you’re going to see over the next few years is an evolution to what you see today, with maybe a bit more emphasis on AR than VR. But it’s not going to be this rich, simulated 3D environment.”

A generational leap

In the beginning, chips powered mainframes. Mainframes begat servers, home computers and smartphones: smaller, faster and cheaper versions of more or less the same technology that came before.

If the metaverse is next, nobody can describe the system requirements specifically because it will be a distinct departure from prior shifts in computing. But it has become clear that to achieve anything close to the optimistic version, chips of nearly every kind will have to be an order of magnitude more powerful than they are today.

Intel’s Raja Koduri took a stab at the question in a recent editorial, writing: “Truly persistent and immersive computing, at scale and accessible by billions of humans in real time, will require even more: a 1,000-times increase in computational efficiency from today’s state of the art.”

It’s difficult to overstate how challenging it will be to reach the goal of a thousandfold increase in computing efficiency. Koduri’s estimate might be conservative, and the demands could easily exceed 10 times that amount.

Even assuming those onerous hardware requirements can be met, better communication between all layers of the software stack — from chips at the bottom to end-user applications at the top — will also be required, University of Washington computer science professor Pedro Domingos told Protocol.

“We can get away with [inefficiency] now, but we’re not going to get away with it in the metaverse,” he said. “The whole [software] stack is going to be more tightly integrated, and this is already happening in areas such as AI and, of course, graphics.”

It’s not quantum computing

The generational leap toward the metaverse probably won’t come from quantum computing, at least as we think of it today: a largely theoretical platform decades from practical use, requiring room-sized machines that keep their processors at temperatures near absolute zero. But a performance breakthrough on the scale quantum computing promises will be necessary.

Google is exploring using algorithms to design more powerful chips, which could help move the needle. Special-purpose processors for AI models exist today, but by creating even more specialized chips, it’s possible to eke out more performance, Domingos said. Those designs can circumvent roadblocks to increasing the raw performance of existing silicon, such as making an application-specific integrated circuit that performs physics calculations.

“These companies — the chip-makers, or the providers of the metaverse, or who knows — will make more and more advanced chips for this purpose,” Domingos said. “For every level of the stack, from the physics to the software, there are things you can do.”

Domingos noted that in the 1990s, real-time ray tracing would have been considered impossible, yet decades later it’s done routinely by the chips that power the PlayStation 5 and Xbox Series X. Google’s AI chips, known as tensor processing units, are another example of a specialized type of chip that will only become more abundant in the future, and that the metaverse will need.

A fabulous future

But generational shifts in computing also require equivalent shifts in manufacturing technology. Companies such as TSMC and Intel are already pushing the boundaries of physics with extreme ultraviolet lithography machines to print the most advanced chips.

The latest EUV machines are dedicated to squeezing larger numbers of ever-smaller transistors and features onto each chip, continuing down the path that has been established for decades. But at some point in the future, the chip-making machines will become too costly, or it will be impossible to shrink features any further.

“If you look at where the architecture stands, if you look at where the performance per watt stands, I don’t want to say we need a breakthrough, but we’re pretty close to needing a breakthrough,” Bajarin said. “Sub-one nanometer is roughly four or five years away, and that’s not going to solve this problem.”

Without a generational leap in computing, a lower-fidelity version of the Zuckerverse is attainable. Assuming users will settle for graphics only somewhat better than what Second Life achieved a decade ago, it should be possible in the longer run to build something that meets some of the goals, such as a persistent, internet-connected virtual world. That version of the metaverse will still require better networking tech, the specialized chips Domingos described and possibly AI systems to handle some of the more complex but routine workloads.

“There’s a lot of scaling up to do, which means that today’s data centers are going to look minuscule compared with the ones of tomorrow,” Domingos said.

But it’s going to take a long time to get there. Zuckerberg’s vision of the metaverse could be decades away, and after losing $20 billion on the effort so far, it's not clear Meta will have the cash to turn that vision into reality.
