The chiplet boomlet
Hello and welcome to Protocol Enterprise! Today: How a modular chip design approach reinvigorated AMD and might have altered chipmaking strategies across the industry, inside the latest White House summit on the cybersecurity talent gap, and Oracle database users have an easier path to Microsoft Azure.
Every chip in its right place
For years the conventional wisdom dictated that to make computer chips bigger, better, faster, stronger, designers needed to pack more features onto an increasingly large piece of silicon. The idea was that by bringing together more parts of a system onto a single chip, semiconductor companies could deliver a predictable performance gain every two years. Until they couldn’t.
- Bigger pieces of silicon — referred to as “dies” — created a number of problems for chipmakers.
- The fewer, larger dies printed on each silicon wafer, the greater the chance that any one of them contains a defect.
- The photolithography reticle used to print a chip onto a wafer cannot produce chips larger than about 850 square millimeters, among other issues.
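The yield argument in the bullets above can be sketched with a standard back-of-the-envelope calculation. This uses the common Poisson yield model; the defect density and die sizes below are illustrative assumptions, not figures from the article:

```python
import math

def poisson_yield(defect_density_per_mm2, die_area_mm2):
    """Expected fraction of defect-free dies under a simple Poisson defect model."""
    return math.exp(-defect_density_per_mm2 * die_area_mm2)

# Illustrative assumption: 0.1 defects per cm^2, i.e. 0.001 per mm^2.
d0 = 0.001

big_die = poisson_yield(d0, 800)   # one monolithic 800 mm^2 die
chiplet = poisson_yield(d0, 200)   # one 200 mm^2 chiplet

# With a monolithic die, a single defect scraps all 800 mm^2.
# With chiplets, each 200 mm^2 piece is tested and binned separately,
# so a defect only scraps the chiplet it lands on.
print(f"good-silicon fraction, monolithic: {big_die:.1%}")  # ~44.9%
print(f"good-silicon fraction, chiplets:  {chiplet:.1%}")   # ~81.9%
```

The same total silicon area yields far more usable product when it is diced into independently testable pieces, which is the economic heart of the chiplet approach.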
AMD was the first of the chip giants to change course. Building on a concept called “chiplets” that dates back to at least 1965, the company’s engineers developed a design that breaks apart each big chip into smaller pieces.
- Chiplets solved a lot of problems for AMD all at once, and most importantly gave the company a way to compete with Intel in the server market. In 2015, when AMD began to pursue commercializing chiplets, Intel dominated the market.
- The chiplet-based design lowered production costs by 40%, and it allowed AMD to make an entire lineup of server products without redesigning a chip for each end market.
- Chiplets also gave AMD a way to build desktop processors, which served what was then the company's most profitable market.
Nvidia has taken a decidedly different approach to chipmaking: chiplets simply don't make sense for the kinds of tasks the company's chips excel at, primarily video game graphics and AI-related computing.
- GPUs are really good at performing thousands and thousands of relatively simple calculations simultaneously, which makes them well suited to render video game graphics or to train AI models.
- To overcome the big die problem, Nvidia chose to go even bigger. It developed technology used to connect multiple large-die graphics chips.
- To Nvidia Vice President Ian Buck, the ultimate expression of that idea is the forthcoming Grace Hopper product, which fuses an Arm-based, Nvidia-designed CPU to one of its GPUs.
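The data-parallel pattern described above, the same simple operation applied across many elements at once, can be illustrated with NumPy as a CPU-side stand-in for what a GPU spreads across thousands of cores (the array sizes here are arbitrary illustrative choices):

```python
import numpy as np

# A GPU-style workload: one simple operation applied to a million
# independent elements. Shading a frame or running a training step
# both reduce to large batches of operations like this.
a = np.random.rand(1_000_000).astype(np.float32)
b = np.random.rand(1_000_000).astype(np.float32)

# One vectorized expression performs a million independent
# multiply-adds; a GPU would execute these across thousands of
# parallel cores rather than looping through them one at a time.
c = a * b + 1.0

print(c.shape)  # (1000000,)
```

Because each element's result is independent of every other element's, the work scales almost perfectly with the number of parallel execution units, which is why bigger dies (or tightly linked multiples of them) pay off for these workloads.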
Widening the pool
The shortfall in entry-level cybersecurity jobs appears to be gaining wider acknowledgment in both the private sector and the federal government.
Top Biden administration officials and industry executives convened Tuesday at the White House to tackle the cybersecurity talent and diversity gap. And according to two executives who attended the summit, there was plenty of discussion around how staffing and hiring practices need to change around entry-level cybersecurity roles.
In remarks by top administration officials and breakout sessions at the White House summit, significant attention was placed on the need to "use entry-level positions as a means of getting … a broader set of talent" into the cybersecurity field, Akamai chief security officer Boaz Gelbord told me. In other words, there was an acknowledgment that "it's not just a supply problem,” Gelbord said.
Valerie Abend, who heads global financial services security at Accenture, said she brought up the need for many employers to reassess the job requirements they have for entry-level cybersecurity roles. Unrealistic expectations for entry-level security jobs have often been cited as a major barrier both to widening the talent pool and to increasing diversity. Ultimately at the summit, "I think what you heard was, all of this does involve the private sector," Abend told me.
The summit wasn't all talk, though. The Labor and Commerce departments on Tuesday announced a new initiative to encourage employers to make greater use of cybersecurity apprenticeships, which are often designed to lead to full-time jobs.
— Kyle Alspach (email | twitter)

Around the enterprise
Oracle customers clinging to their databases who want to run their applications in Microsoft Azure have a new integration service that promises to eliminate a lot of the custom work required, the companies announced.
Russian hacking groups are increasingly using public cloud storage services like Dropbox and Google Drive to deliver malware, according to Palo Alto Networks researchers.
Thanks for reading — see you tomorrow!