
Photo: Patrick T. Fallon/Bloomberg via Getty Images
Nvidia CEO Jensen Huang

Nvidia's $40B bet on AI, edge computing and the data center of the future

With its purchase of chip designer Arm, Nvidia hopes to build on its recent success selling machine-learning chips for data centers and define the next decade of enterprise computing.

Nvidia's landmark purchase of chip design stalwart Arm will have an immediate impact on the mobile market, but the long-term payoff from the deal is likely to come from the enterprise.

The $40 billion deal, coming just four years after SoftBank Group's 2016 acquisition of the U.K. chip designer, sets the stage for Nvidia's growing ambitions in enterprise computing. Nvidia rose to prominence building graphics chips for powerful gaming PCs, but in recent years it has ridden a wave of interest in data center chips that can run complex machine-learning algorithms.


Now Nvidia wants to play an even larger role in the data center. With Arm, it could design chips that combine server-class processing power with machine-learning aptitude, setting the stage for the next era of cloud computing. Emerging concepts like edge computing demand chips designed for difficult operating environments far from spacious data centers, and edge devices will also produce massive new data sets that will require artificial intelligence to capture and analyze.

"These types of computing platforms are going to start changing in shapes and sizes, although the architecture is going to be very similar," said Nvidia CEO Jensen Huang on a conference call following the announcement of the deal. He was referring not just to data centers, but autonomous vehicles, robotics and high-performance computing applications that will require new types of chips designed for those unique situations.

If you own a smartphone, you almost certainly use a chip based around Arm's designs or its underlying technology. Arm doesn't actually make chips, but it designs and licenses core chip technology to other companies, like Apple, Qualcomm and Samsung, that add their own bells and whistles and contract with third-party manufacturers to produce processors for mobile phones.

With this acquisition, Nvidia is also betting that it can extend that model further into the data center. Arm's designs are known for their energy efficiency, which for years was not as prominent a concern inside the server farms that run the internet as compared to performance. But as the costs of operating cloud computing and in-house data centers have skyrocketed thanks to the energy and cooling systems required to operate millions of servers, companies are starting to think differently about the balance of price, performance and power consumption when making purchasing decisions.

It has taken a long time for Arm server chips to achieve performance parity with those designed and built by Intel, which is just one of the reasons why Intel currently enjoys more than 90% market share in the data center. That business generated $38.4 billion in revenue in 2019, making it a big target for Intel's rivals.

However, 2019 was also the first year during which cloud giant AWS started allowing customers to rent computing power running on a custom server processor designed around Arm's technology. A second-generation chip introduced earlier this year offers a "40% improvement on cost/performance ratio" compared to Intel chips available through AWS, according to the cloud provider.
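To see what a "40% improvement on cost/performance ratio" means in practice, here is a minimal sketch of the arithmetic. The prices and throughput figures below are hypothetical placeholders, not AWS's actual numbers; the point is only that cost/performance is cost divided by work done, so the improvement can come from a lower hourly price, higher throughput, or both.

```python
# Hypothetical illustration of a "40% better cost/performance" claim.
# Cost/performance = cost per unit of work completed; lower is better.
# All numbers below are made up for the example.

x86_price_per_hour = 0.10   # hypothetical on-demand price, USD/hour
x86_units_per_hour = 100.0  # hypothetical work units completed per hour

arm_price_per_hour = 0.075  # hypothetical: slightly cheaper per hour
arm_units_per_hour = 125.0  # hypothetical: more work per hour

x86_cost_per_unit = x86_price_per_hour / x86_units_per_hour
arm_cost_per_unit = arm_price_per_hour / arm_units_per_hour

# Fractional improvement in cost per unit of work.
improvement = 1 - arm_cost_per_unit / x86_cost_per_unit
print(f"cost per work unit improves by {improvement:.0%}")  # → 40%
```

With these placeholder numbers, a 25% throughput gain combined with a 25% price cut compounds into a 40% drop in cost per unit of work, which is why modest wins on both axes can add up to a headline figure like AWS's.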

Should AWS prove real demand for Arm server processors exists in the cloud, Microsoft and Google — both of which have chip design expertise in house — are likely to follow suit. Intel's well-documented challenges advancing its chipmaking technology have also opened the door for rivals with alternative products.

Over the long term, however, the real key to the deal is the marriage of Nvidia's AI technology and Arm's processing expertise. It is also one of the major concerns about the deal: that Nvidia's corporate interests will cast a shadow over Arm's road map, bending it toward designs that advance Nvidia's strategy at the expense of Arm's current third-party licensees.

"Every 15 years, the computer industry goes through a strategic inflection point, or as Jefferies U.S. semiconductors analyst Mark Lipacis calls it, a tectonic shift, that dramatically transforms the computing model and realigns the leadership of the industry," wrote Sparq Capital's Michael Bruck, a former Intel executive, in a post earlier this month on VentureBeat discussing the potential Nvidia-Arm deal.

The last 15 years have been dominated by the rise of smartphones and cloud computing as the two main forces in modern computing, best exemplified by Apple's Arm-powered iPhone and AWS' Intel-powered cloud operation. Nvidia's $40 billion gamble assumes that the next 15 years will require both edge computing devices and massive data centers to add specialized machine-learning chips to serve new applications such as self-driving cars, which won't be able to tolerate processing delays caused by underpowered chips or round trips to far-flung data centers.

"In time, there will be trillions of these small autonomous computers, powered by AI, connected to massively powerful cloud data centers in every corner of the world," Huang said. If Nvidia can overcome the concerns of the U.K. tech industry and pull off this deal without alienating the massive ecosystem of Arm partners, it will be the chip company for that era.
