
Mark Zuckerberg’s metaverse will require computing tech no one knows how to build

To achieve anything close to what metaverse boosters promise, experts believe nearly every kind of chip will have to be an order of magnitude more powerful than it is today.

Illustration: Christopher T. Fong/Protocol

The technology necessary to power the metaverse doesn’t exist.

It will not exist next year. It will not exist in 2026. The technology might not exist in 2032, though by then we will likely have a few ideas about how to design and manufacture chips that could turn Mark Zuckerberg’s fever dreams into reality.

Over the past six months, a disconnect has formed between the way corporate America is talking about the dawning concept of the metaverse and its plausibility, given the computing power that delivering it will demand. Getting there will require immense innovation, similar to the multi-decade effort to shrink personal computers to the size of an iPhone.

Microsoft hyped its $68.7 billion bid for Activision Blizzard last month as a metaverse play. In October, Facebook transformed its entire corporate identity to revolve around the metaverse. Last year, Disney even promised to build its own version of the metaverse to “allow storytelling without boundaries.”

These ideas hinge on our ability to build the chips, data centers and networking equipment needed to deliver the computing horsepower required. And at the moment, we can’t. No one knows how, or where to start, or even whether the devices will still be semiconductors. There aren’t enough chips right now to build all the things people want today, let alone what’s promised by metaverse preachers.

“The biggest things that we are looking at in supercomputers today still need to be improved in order to be able to deliver [a metaverse] type of experience,” Jerry Heinz, the former head of Nvidia’s Enterprise Cloud unit, told Protocol.

Zuckerversed

What we now describe as the metaverse is at least as old as early-20th-century speculative fiction.

E.M. Forster’s 1909 story “The Machine Stops,” for example, renders a pre-chip, pre-digital version of the metaverse. Fast-forward 75 years, and science-fiction writer William Gibson called the concept “cyberspace” in his 1984 novel “Neuromancer”; Neal Stephenson popularized the word “metaverse” in his 1992 novel “Snow Crash”; Ernest Cline called it OASIS (an acronym for Ontologically Anthropocentric Sensory Immersive Simulation) in his 2011 novel “Ready Player One.” Few of those stories describe a utopian community.

It’s possible that what we now call the metaverse will forever remain the domain of science fiction. But like it or not, Mark Zuckerberg has vaulted the idea into the mainstream.

Zuckerberg’s explanation of what the metaverse will ultimately look like is vague, but includes some of the tropes its boosters roughly agree on: He called it “[an] embodied internet that you’re inside of rather than just looking at” that would offer everything you can already do online and “some things that don’t make sense on the internet today, like dancing.”

If the metaverse sounds vague, that’s because it is. The description could stretch over time to cover lots of things that might eventually happen in technology. And arguably, something like the metaverse already exists in an early form, produced by video game companies.

Roblox and Epic Games’ Fortnite play host to millions — albeit in virtually separated groups of a few hundred people — viewing live concerts online. Microsoft Flight Simulator has created a 2.5 petabyte virtual replica of the world that is updated in real time with flight and weather data.

But even today’s most complex metaverse-like video games require a tiny fraction of the processing and networking performance we would need to achieve the vision of a persistent world accessed by billions of people, all at once, across multiple devices, screen formats and in virtual or augmented reality.
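
The scaling problem is easy to see with a rough back-of-envelope sketch. Assume, purely for illustration, that every player’s state update has to reach every other player in a shared world at a fixed tick rate; real engines prune this with interest management, and the tick rate and message size below are invented numbers. The traffic grows quadratically with the crowd:

```python
# Rough back-of-envelope: why game "shards" top out at a few hundred
# players. Assumes (hypothetically) that each player's state update is
# fanned out to every other player in the same world at a fixed tick
# rate; real engines use interest management to cut this down.

TICK_RATE_HZ = 30    # state updates per second per player (assumed)
UPDATE_BYTES = 100   # size of one state update (assumed)

def egress_gbps(players: int) -> float:
    """Server egress if every update reaches every other player."""
    messages_per_second = players * (players - 1) * TICK_RATE_HZ
    return messages_per_second * UPDATE_BYTES * 8 / 1e9  # -> Gbit/s

for n in (100, 1_000, 100_000):
    print(f"{n:>7,} players -> ~{egress_gbps(n):,.1f} Gbit/s of egress")

# 100 players     -> ~0.2 Gbit/s: one server copes easily.
# 100,000 players -> ~240,000 Gbit/s: hopeless without new architectures.
```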

“For something that is a true mass market, spend-many-hours-a-day doing [kind of activity, we’re looking] at generations of compute to leap forward to do that,” Creative Strategies CEO Ben Bajarin told Protocol. “What you’re going to see over the next few years is an evolution to what you see today, with maybe a bit more emphasis on AR than VR. But it’s not going to be this rich, simulated 3D environment.”

A generational leap

In the beginning, chips powered mainframes. Mainframes begat servers, home computers and smartphones: smaller, faster and cheaper versions of more or less the same technology that came before.

If the metaverse is next, nobody can describe the system requirements specifically because it will be a distinct departure from prior shifts in computing. But it has become clear that to achieve anything close to the optimistic version, chips of nearly every kind will have to be an order of magnitude more powerful than they are today.

Intel’s Raja Koduri took a stab at the question in a recent editorial, writing: “Truly persistent and immersive computing, at scale and accessible by billions of humans in real time, will require even more: a 1,000-times increase in computational efficiency from today’s state of the art.”

It’s difficult to overstate how challenging it will be to reach the goal of a thousandfold increase in computing efficiency. Koduri’s estimate might be conservative, and the demands could easily exceed 10 times that amount.
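
To put that thousandfold figure in historical context, here is a minimal sketch, assuming, optimistically, that computing efficiency keeps doubling on a fixed cadence the way transistor counts once did; the doubling periods are assumptions, not a forecast:

```python
import math

# Minimal sketch: how long a 1,000x efficiency gain takes if efficiency
# doubles on a fixed cadence. The cadences are assumptions; Moore's-law
# doubling has been slowing, so treat these as best-case timelines.

TARGET_GAIN = 1_000
doublings = math.log2(TARGET_GAIN)  # ~9.97 doublings needed

for years_per_doubling in (2.0, 2.5, 3.0):
    years = doublings * years_per_doubling
    print(f"doubling every {years_per_doubling} yrs -> ~{years:.0f} years to 1,000x")

# Even with a doubling every two years, roughly the historical best case,
# 1,000x is about two decades out; a 10,000x target adds ~3.3 more
# doublings, i.e. another seven to 10 years.
```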

Even assuming those onerous hardware requirements can be met, better communication between all layers of the software stack — from chips at the bottom to end-user applications at the top — will also be required, University of Washington computer science professor Pedro Domingos told Protocol.

“We can get away with [inefficiency] now, but we’re not going to get away with it in the metaverse,” he said. “The whole [software] stack is going to be more tightly integrated, and this is already happening in areas such as AI and, of course, graphics.”

It’s not quantum computing

The generational leap toward the metaverse probably won’t be quantum computing, at least not as we think of it today: a largely theoretical platform, decades from practical use, that performs its calculations in room-sized machines chilled to temperatures colder than deep space. But a performance breakthrough on the scale quantum computing promises will be necessary.

Google is exploring using algorithms to design more powerful chips, which could help move the needle. Special-purpose processors for AI models exist today, but by creating even more specialized chips, it’s possible to eke out more performance, Domingos said. Such designs, for example an application-specific integrated circuit built just to perform physics calculations, can sidestep the roadblocks to increasing the raw performance of general-purpose silicon.

“These companies — the chip-makers, or the providers of the metaverse, or who knows — will make more and more advanced chips for this purpose,” Domingos said. “For every level of the stack, from the physics to the software, there are things you can do.”

Domingos noted that, in the 1990s, real-time ray tracing would have been considered impossible, yet decades later the chips powering the PlayStation 5 and Xbox Series X handle it every frame. Google’s AI chips, known as tensor processing units, are another example of the kind of specialized silicon that will only become more abundant, and that the metaverse will require.
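
As a loose software analogy for what specialization buys, and not a claim about any particular chip, the sketch below times the same physics-style update run as a generic Python loop versus a single vectorized NumPy kernel; the array size and timestep are arbitrary:

```python
import time
import numpy as np

# Loose software analogy for hardware specialization: the same
# physics-style position update run element-by-element in the generic
# interpreter vs. as one vectorized bulk kernel. Sizes are arbitrary.

N = 1_000_000
DT = 0.016  # one 60 Hz frame (assumed)
positions = np.random.rand(N).astype(np.float32)
velocities = np.random.rand(N).astype(np.float32)

def step_generic(pos, vel):
    # General-purpose path: one interpreted operation per element.
    return [p + v * DT for p, v in zip(pos, vel)]

def step_specialized(pos, vel):
    # "Specialized" path: a single fused operation over all elements.
    return pos + vel * DT

t0 = time.perf_counter()
step_generic(positions, velocities)
t1 = time.perf_counter()
step_specialized(positions, velocities)
t2 = time.perf_counter()

print(f"generic loop:      {t1 - t0:.3f} s")
print(f"vectorized kernel: {t2 - t1:.3f} s")  # typically 50-100x faster
```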

A fabulous future

But generational shifts in computing also require equivalent shifts in manufacturing technology. Companies such as TSMC and Intel are already pushing the boundaries of physics with extreme ultraviolet lithography machines to print the most advanced chips.

The latest EUV machines are dedicated to squeezing larger numbers of ever-smaller transistors and features onto each chip, continuing down the path that has been established for decades. But at some point in the future, the chip-making machines will become too costly, or it will be impossible to shrink features any further.

“If you look at where the architecture stands, if you look at where the performance per watt stands, I don’t want to say we need a breakthrough, but we’re pretty close to needing a breakthrough,” Bajarin said. “Sub-one nanometer is roughly four or five years away, and that’s not going to solve this problem.”

Without a generational leap in computing, a lower-fidelity version of the Zuckerverse is attainable. Assuming users will settle for graphics somewhat better than Second Life achieved a decade ago, it should be possible in the longer run to build something that meets some of the goals, such as a persistent, internet-connected virtual world. Building that version of the metaverse will require better networking tech, the specialized chips Domingos described and possibly AI systems to handle some of the more complex but mundane workloads.

“There’s a lot of scaling up to do, which means that today’s data centers are going to look minuscule compared with the ones of tomorrow,” Domingos said.

But it’s going to take a long time to get there. Zuckerberg’s vision of the metaverse could be decades away, and with Meta having already lost $20 billion on the effort, it’s not clear the company will have the cash to turn that vision into reality.
