Power

Nvidia's $40B bet on AI, edge computing and the data center of the future

With its purchase of chip designer Arm, Nvidia hopes to build on its recent success selling machine-learning chips for data centers and define the next decade of enterprise computing.

Nvidia's CEO Jensen Huang

"These types of computing platforms are going to start changing in shapes and sizes, although the architecture is going to be very similar," says Nvidia CEO Jensen Huang.

Photo: Patrick T. Fallon/Bloomberg via Getty Images

Nvidia's landmark purchase of chip design stalwart Arm will have an immediate impact on the mobile market, but the long-term payoff from the deal is likely to come from the enterprise.

The $40 billion deal, coming just four years after SoftBank Group's 2016 acquisition of the U.K. chip designer, sets the stage for Nvidia's growing ambitions in enterprise computing. Nvidia rose to prominence building graphics chips for powerful gaming PCs, but in recent years it has ridden to new heights on a wave of interest in data center chips that can tackle complex machine-learning algorithms.

Now Nvidia wants to play an even larger role in the data center. The company has the potential to design chips that combine server processing power with machine-learning aptitude, setting the stage for the next era of cloud computing. Emerging concepts like edge computing demand chips designed for difficult operating environments far from spacious data centers, and edge devices will also produce massive new data sets that will require artificial intelligence technology to capture and analyze.

"These types of computing platforms are going to start changing in shapes and sizes, although the architecture is going to be very similar," said Nvidia CEO Jensen Huang on a conference call following the announcement of the deal. He was referring not just to data centers, but autonomous vehicles, robotics and high-performance computing applications that will require new types of chips designed for those unique situations.

If you own a smartphone, you almost certainly use a chip based around Arm's designs or its underlying technology. Arm doesn't actually make chips, but it designs and licenses core chip technology to other companies, like Apple, Qualcomm and Samsung, that add their own bells and whistles and contract with third-party manufacturers to produce processors for mobile phones.

With this acquisition, Nvidia is also betting that it can extend that model further into the data center. Arm's designs are known for their energy efficiency, which for years was not as prominent a concern inside the server farms that run the internet as compared to performance. But as the costs of operating cloud computing and in-house data centers have skyrocketed thanks to the energy and cooling systems required to operate millions of servers, companies are starting to think differently about the balance of price, performance and power consumption when making purchasing decisions.

It has taken a long time for Arm server chips to achieve performance parity with those designed and built by Intel, which is just one of the reasons Intel currently enjoys more than 90% market share in the data center. That market share generated $38.4 billion in revenue in 2019, making it a big target for Intel's rivals.

However, 2019 was also the first year during which cloud giant AWS started allowing customers to rent computing power running on a custom server processor designed around Arm's technology. A second-generation chip introduced earlier this year offers a "40% improvement on cost/performance ratio" compared to Intel chips available through AWS, according to the cloud provider.

Should AWS prove real demand for Arm server processors exists in the cloud, Microsoft and Google — both of which have chip design expertise in house — are likely to follow suit. Intel's well-documented challenges advancing its chipmaking technology have also opened the door for rivals with alternative products.

Over the long term, however, the marriage of Nvidia's AI technology and Arm's processing expertise is the real key to the deal. It is also the source of one of the major concerns about the deal: that Nvidia's corporate interests will cast a shadow over Arm's road map, bending it toward designs that advance Nvidia's strategy at the expense of Arm's current third-party licensees.

"Every 15 years, the computer industry goes through a strategic inflection point, or as Jefferies U.S. semiconductors analyst Mark Lipacis calls it, a tectonic shift, that dramatically transforms the computing model and realigns the leadership of the industry," wrote Sparq Capital's Michael Bruck, a former Intel executive, in a post earlier this month on VentureBeat discussing the potential Nvidia-Arm deal.

The last 15 years have been dominated by the rise of smartphones and cloud computing as the two main forces in modern computing, best exemplified by Apple's Arm-powered iPhone and AWS' Intel-powered cloud operation. Nvidia's $40 billion gamble assumes that the next 15 years will require both edge computing devices and massive data centers to add specialized machine-learning chips to serve new applications such as self-driving cars, which won't be able to tolerate processing delays caused by underpowered chips or round trips to far-flung data centers.

"In time, there will be trillions of these small autonomous computers, powered by AI, connected to massively powerful cloud data centers in every corner of the world," Huang said. If Nvidia can overcome the concerns of the U.K. tech industry and pull off this deal without alienating the massive ecosystem of Arm partners, it will be the chip company for that era.

Fintech

Judge Zia Faruqui is trying to teach you crypto, one ‘SNL’ reference at a time

His decisions on major cryptocurrency cases have quoted "The Big Lebowski," "SNL," and "Dr. Strangelove." That’s because he wants you — yes, you — to read them.

The ways Zia Faruqui (right) has weighed in on cases that have come before him can give lawyers clues as to which legal frameworks will pass muster.

Photo: Carolyn Van Houten/The Washington Post via Getty Images

“Cryptocurrency and related software analytics tools are ‘The wave of the future, Dude. One hundred percent electronic.’”

That’s not a quote from "The Big Lebowski" — at least, not directly. It’s a quote from a Washington, D.C., district court memorandum opinion on the role cryptocurrency analytics tools can play in government investigations. The author is Magistrate Judge Zia Faruqui.

Veronica Irwin

Veronica Irwin (@vronirwin) is a San Francisco-based reporter at Protocol covering fintech. Previously she was at the San Francisco Examiner, covering tech from a hyper-local angle. Before that, her byline was featured in SF Weekly, The Nation, Techworker, Ms. Magazine and The Frisc.

The financial technology transformation is driving competition, creating consumer choice, and shaping the future of finance. Hear from seven fintech leaders who are reshaping the future of finance, and join the inaugural Financial Technology Association Fintech Summit to learn more.

FTA
The Financial Technology Association (FTA) represents industry leaders shaping the future of finance. We champion the power of technology-centered financial services and advocate for the modernization of financial regulation to support inclusion and responsible innovation.
Enterprise

AWS CEO: The cloud isn’t just about technology

As AWS preps for its annual re:Invent conference, Adam Selipsky talks product strategy, support for hybrid environments, and the value of the cloud in uncertain economic times.

Photo: Noah Berger/Getty Images for Amazon Web Services

AWS is gearing up for re:Invent, its annual cloud computing conference where announcements this year are expected to focus on its end-to-end data strategy and delivering new industry-specific services.

It will be the second re:Invent with CEO Adam Selipsky as leader of the industry’s largest cloud provider after his return last year to AWS from data visualization company Tableau Software.

Donna Goodison

Donna Goodison (@dgoodison) is Protocol's senior reporter focusing on enterprise infrastructure technology, from the 'Big 3' cloud computing providers to data centers. She previously covered the public cloud at CRN after 15 years as a business reporter for the Boston Herald. Based in Massachusetts, she also has worked as a Boston Globe freelancer, business reporter at the Boston Business Journal and real estate reporter at Banker & Tradesman after toiling at weekly newspapers.

Image: Protocol

We launched Protocol in February 2020 to cover the evolving power center of tech. It is with deep sadness that just under three years later, we are winding down the publication.

As of today, we will not publish any more stories. All of our newsletters, apart from our flagship, Source Code, will no longer be sent. Source Code will be published and sent for the next few weeks, but it will also close down in December.

Bennett Richardson

Bennett Richardson (@bennettrich) is the president of Protocol. Prior to joining Protocol in 2019, Bennett was executive director of global strategic partnerships at POLITICO, where he led strategic growth efforts including POLITICO's European expansion in Brussels and POLITICO's creative agency POLITICO Focus during his six years with the company. Prior to POLITICO, Bennett was co-founder and CMO of Hinge, the mobile dating company recently acquired by Match Group. Bennett began his career in digital and social brand marketing working with major brands across tech, energy, and health care at leading marketing and communications agencies including Edelman and GMMB. Bennett is originally from Portland, Maine, and received his bachelor's degree from Colgate University.

Enterprise

Why large enterprises struggle to find suitable platforms for MLops

As companies expand their use of AI beyond running just a few machine learning models, and as larger enterprises go from deploying hundreds of models to thousands or even millions, ML practitioners say they have yet to find what they need from prepackaged MLops systems.


Photo: artpartner-images via Getty Images

On any given day, Lily AI runs hundreds of machine learning models using computer vision and natural language processing that are customized for its retail and ecommerce clients to make website product recommendations, forecast demand, and plan merchandising. But this spring when the company was in the market for a machine learning operations platform to manage its expanding model roster, it wasn’t easy to find a suitable off-the-shelf system that could handle such a large number of models in deployment while also meeting other criteria.

Some MLops platforms are not well suited to maintaining more than even 10 machine learning models when it comes to keeping track of data, navigating their user interfaces, or reporting capabilities, Matthew Nokleby, machine learning manager for Lily AI's product intelligence team, told Protocol earlier this year. "The duct tape starts to show," he said.

Kate Kaye

Kate Kaye is an award-winning multimedia reporter digging deep and telling print, digital and audio stories. She covers AI and data for Protocol. Her reporting on AI and tech ethics issues has been published in OneZero, Fast Company, MIT Technology Review, CityLab, Ad Age and Digiday and heard on NPR. Kate is the creator of RedTailMedia.org and is the author of "Campaign '08: A Turning Point for Digital Media," a book about how the 2008 presidential campaigns used digital media and data.
