Intel's Ocotillo campus in Chandler, Ariz., has four chipmaking plants known as fabs.
Photo: Intel Corp.

Software is eating the chip industry

Protocol Enterprise

Hello and welcome to Protocol Enterprise! Today: why chipmakers are struggling to recruit the talent they need to expand U.S. production, Google Cloud sets up a new public-sector division and Intel improves its optics.

If you build it, will they come?

To deliver hundreds of billions of dollars’ worth of expansion, the chip industry is going to have to do a lot of hiring over the next five years: 27,000 people, to put a number on it, according to research from Georgetown’s Center for Security and Emerging Technology.

  • As manufacturing in the U.S. has waned, software companies have gobbled up most of the engineering talent that might otherwise have gone into chips.

Competition for the talent required to staff the chip factories of the future has triggered something of a fight between industry heavyweights such as Intel and TSMC, both of which are racing to build or expand operations in Arizona.

  • Intel, in some ways, has a home-field advantage in the U.S. It has been operating in the country since the 1960s and has developed ties to schools such as Arizona State University and Maricopa County’s community colleges.
  • There are mixed reports about TSMC’s early hiring efforts, with some accounts suggesting that American recruits are having trouble adjusting to the company’s culture.
  • TSMC said that’s not the case, citing a training program that has sent 500 new hires from Arizona to Taiwan for a six-month training stint before they return to the U.S.

It’s not just semiconductor manufacturing companies that will require new engineering talent. Chip companies such as Nvidia, Qualcomm and AMD design chips but have outsourced manufacturing to the likes of Samsung and TSMC.

  • MediaTek, for example, devoted 5,000 people to designing its latest chip for 5G wireless communication.
  • The number of people needed to design future advanced chips is only going to increase, and, like the engineers required for manufacturing, those designers don’t exist yet either.
  • “For chip design companies, it's becoming increasingly harder to meet the need for a lot more people; chip architectures are much more complicated and to sustain the rate of innovation required just for established companies is challenging,” MediaTek government affairs head Patrick Wilson told Protocol.
  • To wit, Wilson said that demand for chips is rising by 50% for the second year in a row, but there hasn’t been a corresponding increase in the number of graduate students, who are vital for chip design.

— Max A. Cherney (email | twitter)

A MESSAGE FROM LOGITECH

Hybrid work success looks different depending on who you ask. Your company is made up of a cast of players, each with a role critical to a competitive and thriving business, and with an eye on their North Star: employee happiness. How do you appease all those stakeholders?

Learn more

Google Cloud heads to D.C.

Google has created a new, separate division to double down on Google Cloud’s work with the U.S. public sector, including local, state and federal governments and educational institutions.

Google Public Sector will operate as a Google subsidiary with its own board, consistent with other technology companies’ government divisions, Google Cloud CEO Thomas Kurian said in a blog post today.

Will Grannis, Google Cloud’s chief technology officer, will lead Google Public Sector until a permanent CEO is named. Lynn Martin, vice president of the North American public sector at Google Cloud, will lead the new division’s go-to-market efforts.

“Google Public Sector will provide a full complement of business functions and capabilities, including specialized sales, customer engineering, customer success and services, customer support, channel and partner programs, compliance and security operations, so that our U.S. public sector customers can leverage the full range of technology offerings from Google Cloud,” Kurian said.

Those capabilities previously were handled within Google Cloud, whose U.S. federal government work includes contracts with the Air Force, Navy, the Department of Defense’s (DoD) Defense Innovation Unit, the Department of Veterans Affairs, the Department of Energy and the U.S. Patent and Trademark Office.

“With Google Public Sector, our plan is to continue down the path of achieving the highest levels of U.S. government certifications and requirements possible,” Kurian said. “This means the division will have the capability to manage sensitive government data.”

At the federal government level, that will include security clearances. Google Cloud is vying to win part of the potentially $9 billion Joint Warfighting Cloud Capability contract that the DoD expects to award to multiple cloud providers in December. Employee concerns over working with the military led Google to take itself out of the running for the ultimately doomed JEDI contract prior to Kurian’s arrival.

— Donna Goodison (email | twitter)

Light-speed data centers

Intel is well known for its devotion to Moore’s Law. But there is no such thing as Moore’s Law when it comes to optics, a field of research that is crucial to the underlying goal of moving bits very, very quickly around a data center.

Broadly, this technology is called photonics, and it can theoretically move data as fast as light travels. On Tuesday, Intel said it had taken a significant step toward developing a laser array that greatly increases the communications bandwidth achievable when optically interconnecting processors.

The new laser array is an important iteration of photonics tech for two reasons: First, it demonstrates Intel’s ability to shrink a much larger component down to microchip-level scale. And second, Intel says that it can produce the new tech at high volume, which is typically the rub with just about any chip-related development given that data centers need thousands of chips.
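For a rough sense of why a laser array matters for bandwidth: in wavelength-division multiplexing, each laser supplies a separate wavelength that can carry its own data stream, so a single fiber’s throughput scales roughly with the number of lasers. The sketch below is only a back-of-the-envelope illustration with hypothetical numbers; Intel’s announcement doesn’t spell out per-wavelength data rates here.

```python
# Back-of-the-envelope sketch (not Intel's actual figures): with
# wavelength-division multiplexing, each laser in the array carries its own
# data channel, so aggregate link bandwidth scales roughly with laser count.

def aggregate_bandwidth_gbps(num_lasers: int, gbps_per_wavelength: float) -> float:
    """Rough aggregate bandwidth of one optical link, one data channel per laser."""
    return num_lasers * gbps_per_wavelength

# Hypothetical example: an 8-laser array at 32 Gbps per wavelength would give
# roughly 256 Gbps over a single fiber, before any protocol overhead.
print(aggregate_bandwidth_gbps(8, 32.0))  # 256.0
```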

With Tuesday’s announcement, companies involved in data center networking such as Broadcom, HPE, Marvell and Nvidia head into the July Fourth holiday in a slightly weaker competitive position than they held on Monday.


— Max A. Cherney (email | twitter)

Around the enterprise

Databricks made several product announcements at its developer conference, including the newest generation of its data lakehouse builder and Project Lightspeed, a bid to improve the performance of the Apache Spark open-source project.


Our colleagues at Protocol Climate took a look at data center operations in very dry places like Arizona, and how data center operators are trying to use water-cooled systems more efficiently.

A MESSAGE FROM LOGITECH

Rightsizing, where each meeting space is outfitted for a specific purpose, is top of mind for facilities pros. Reconfiguring rooms to support new hybrid work schedules enables personalization and a safe return to the office. Understanding how employees will use spaces as they come back will be critical for success.

Learn more

Thanks for reading — see you tomorrow!
