
A NOAA supercomputer upgrade could improve US supply chains

The Cray supercomputers will triple the computing power at NOAA's disposal.

Big-rig truck driving under dark storm clouds

Being able to more reliably route supplies around storms could have tremendous value for some companies.

Photo: Getty Images

A huge performance upgrade to the supercomputers that predict American weather could enable companies to keep delivering their products even in the face of severe storms.

Ahead of a storm, retailers like Home Depot and Walmart want to move the products people will need into the areas likely to be affected, then pause shipments once the weather hits.


"Knowing where that would hit, they would be able to better quantify what to ship, where and when," Chris Caplice, executive director at the MIT Center for Transportation & Logistics, told Protocol.

But weather is difficult to predict: The sheer number of variables to account for in modeling the Earth makes it a daunting task. And the weather events that have the most profound effect on U.S. industry, like hurricanes, tend to carry a great deal of uncertainty before they make landfall.

Now, the U.S. is taking a real stab at finally delivering more accurate weather forecasts, with powerful new supercomputers.

The U.S. National Oceanic and Atmospheric Administration announced Feb. 20 that, over the next two years, it will upgrade the supercomputers it uses to model the Earth's weather patterns. The new computers will initially be tasked with improving NOAA's medium-term forecasting abilities, for events from one day to 16 days out, as well as its modeling of severe regional weather like hurricanes, Brian Gross, director of the Environmental Modeling Center at the National Weather Service, told Protocol.

A forecast showing a hurricane's potential path covering the entire state of Florida isn't much help to companies, Caplice said. But if NOAA's new supercomputers can significantly narrow that path, "retailers would love it," he said. That could have "tremendous value" for some companies, especially those with little slack in when they can deliver products, Caplice added.

NOAA's new machines from Cray will be located in Virginia and Arizona, and each will have a capacity of 12 petaflops. For reference, the current fastest computer in the world clocks in at 148.6 petaflops, but the tenth-fastest machine achieves roughly 18 petaflops, so NOAA's computers aren't that far off the pace.

The Cray supercomputers will triple the computing power at NOAA's disposal, the administration said. And they will be used to process data collected by the NWS from satellites, weather balloons, radar, buoys, and other sources above and on the Earth's surface.

"This is a pretty big upgrade for us," Gross said.

In the long term, the computers will help NOAA improve its forecasting models in three specific ways: increasing model resolution, so forecasts can zero in on smaller events like flash floods or thunderstorms; increasing model complexity, so more physical factors can be included in a single model; and increasing the number of forecasts that can be run at once, which determines how confident NOAA can be about an upcoming weather event.
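That last point, what meteorologists call ensemble forecasting, is easier to see with a toy example. The following is a minimal, purely illustrative sketch, not NOAA's actual modeling code: it runs many copies of a made-up forecast with slightly perturbed starting conditions, and treats the spread across those copies as a rough confidence estimate.

```python
# Illustrative only: a toy "ensemble" of rainfall forecasts, not NOAA's models.
# Each ensemble member is the same simple forecast run from slightly perturbed
# starting conditions; the spread across members is a rough confidence measure.
import random
import statistics

def toy_rainfall_forecast(initial_moisture: float) -> float:
    """A stand-in 'model': maps an initial condition to predicted rainfall (mm)."""
    return 10.0 + 25.0 * initial_moisture

def run_ensemble(base_moisture: float, members: int) -> list[float]:
    rng = random.Random(42)
    forecasts = []
    for _ in range(members):
        # Perturb the initial condition slightly for each ensemble member.
        perturbed = base_moisture + rng.gauss(0.0, 0.05)
        forecasts.append(toy_rainfall_forecast(perturbed))
    return forecasts

forecasts = run_ensemble(base_moisture=0.6, members=50)
mean_rain = statistics.mean(forecasts)
spread = statistics.stdev(forecasts)
print(f"Ensemble mean: {mean_rain:.1f} mm, spread: {spread:.1f} mm")
# A small spread relative to the mean suggests higher confidence; more members,
# which more computing power allows, gives a better read on that spread.
```

The real models are vastly more complex, of course, but the principle holds: more computing power means more ensemble members, and a better sense of how much to trust a given forecast.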

NOAA's computational upgrades are actually meant to bring the U.S. "on par" with many of its peer agencies around the world, said David Michaud, director of the NWS Office of Central Processing. Agencies like the U.K.'s Met Office and Europe's ECMWF have capabilities similar to what the U.S. will have, he said, and they all work together. "Across the modeling community, there's a real collaborative feel, in terms of interacting and sharing best practices with science," Michaud said. "We're staying on par in terms of our contribution with the science in collaboration with the community."

Implementing massive supercomputers like these is not like setting up a new desktop computer. "Generally the transition between one contract to another takes about a two-year timeframe," Michaud said. But in about a year, if everything goes according to plan, NOAA developers will start transitioning code over to the new system and checking that it's all configured correctly. If that goes off without a hitch, Michaud said, the computers should be fully operational by February 2022.


Given that weather information is so critical for businesses that rely on rapid supply chains, some companies will be eagerly awaiting the results.

"If you know, for example, the upper plains of the Mississippi is going to flood, then companies can take advantage of that, and maybe advanced-ship or reroute," Caplice said. "I can see in weather disasters it could be very helpful."
