
A NOAA supercomputer upgrade could improve US supply chains

The Cray supercomputers will triple the computing power at NOAA's disposal.


Being able to more reliably route supplies around storms could have tremendous value for some companies.

Photo: Getty Images

A huge performance upgrade to the supercomputers that predict American weather could enable companies to keep delivering their products even in the face of severe storms.

Ahead of a storm, retailers like Home Depot and Walmart want to stock the areas that will be affected with the products people will need, then pause shipments once the weather hits.


"Knowing where that would hit, they would be able to better quantify what to ship, where and when," Chris Caplice, executive director at the MIT Center for Transportation & Logistics, told Protocol.

But weather is difficult to predict: The sheer number of variables involved in modeling the Earth makes it a daunting task. And the weather events with the most profound effect on U.S. industry, like hurricanes, tend to carry a great deal of uncertainty before they make landfall.

Now, the U.S. is taking a serious run at delivering more accurate weather forecasts with powerful new supercomputers.

The U.S. National Oceanic and Atmospheric Administration announced Feb. 20 that it would be upgrading the supercomputers it uses to model the Earth's weather patterns over the next two years. The new computers will initially be tasked with improving NOAA's medium-term weather forecasting abilities — for events from one day to 16 days out — as well as better modeling for severe regional weather, like hurricanes, Brian Gross, director of the Environmental Modeling Center at the National Weather Service, told Protocol.

For companies, seeing that a hurricane's potential path covers the entire state of Florida isn't helpful, Caplice said. But if NOAA's new supercomputers can significantly narrow that path, "retailers would love it," he said. That could have "tremendous value" for some companies, especially those that don't have a lot of slack in when they can deliver products, Caplice added.

NOAA's new machines from Cray will be located in Virginia and Arizona, and each will deliver 12 petaflops of computing performance. For reference, the current fastest computer in the world clocks in at 148.6 petaflops, while the tenth-fastest machine achieves roughly 18 petaflops, so NOAA's computers aren't that far off the pace.
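To put those numbers in perspective, here is a back-of-the-envelope comparison using only the figures cited above (a rough sketch, not an official benchmark):

```python
# Rough comparison of one new NOAA machine against the rankings
# cited above. All numbers are in petaflops.
noaa_per_machine = 12.0   # each new Cray system
fastest = 148.6           # fastest supercomputer in the world
tenth_fastest = 18.0      # roughly the tenth-fastest machine

print(f"vs. fastest:       {noaa_per_machine / fastest:.0%}")        # ~8%
print(f"vs. tenth fastest: {noaa_per_machine / tenth_fastest:.0%}")  # ~67%
```

Each NOAA machine lands at roughly two-thirds of the tenth-fastest system's performance, which is why it's not that far off the pace.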

The Cray supercomputers will triple the computing power at NOAA's disposal, the administration said. And they will be used to process data collected by the NWS from satellites, weather balloons, radar, buoys, and other sources above and on the Earth's surface.

"This is a pretty big upgrade for us," Gross said.

In the long term, the computers will help NOAA improve its forecasting models in three specific ways: increasing model resolution, which lets forecasters zero in on smaller events like flash floods or thunderstorms; increasing model complexity, so more physical factors can be folded into a single model; and increasing the number of forecasts that can be run at once, which determines how confident NOAA can be about an upcoming weather event.
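The third of those, ensemble forecasting, is where extra compute most directly translates into confidence. A minimal sketch of the idea (hypothetical model and numbers, not NOAA's actual code): run the same forecast many times from slightly perturbed starting conditions, and let the spread of the results indicate how much to trust the prediction.

```python
import random

def run_forecast(initial_temp_f, days_out):
    # Stand-in for a real numerical weather model: the forecast state
    # drifts with noise that grows as the lead time gets longer.
    temp = initial_temp_f
    for _ in range(days_out):
        temp += random.gauss(0, 1.5)  # hypothetical per-day uncertainty
    return temp

def ensemble_forecast(initial_temp_f, days_out, members):
    # Perturb the starting condition for each member, mimicking
    # uncertainty in the observations the forecast begins from.
    runs = [run_forecast(initial_temp_f + random.gauss(0, 0.5), days_out)
            for _ in range(members)]
    mean = sum(runs) / members
    spread = (sum((r - mean) ** 2 for r in runs) / members) ** 0.5
    return mean, spread  # a smaller spread means higher confidence

mean, spread = ensemble_forecast(70.0, days_out=10, members=50)
print(f"10-day outlook: {mean:.1f}F, spread {spread:.1f}F")
```

More computing power means more members per ensemble and a better-sampled spread, which is exactly the kind of narrowing of a storm's projected path that Caplice says retailers would love.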

NOAA's computational upgrades will actually bring the U.S. "on par" with many of its peer agencies around the world, said David Michaud, the NWS' director of the Office of Central Processing. Agencies like the U.K.'s Met Office and Europe's ECMWF have capabilities similar to what the U.S. will have, he said, and they all work together. "Across the modeling community, there's a real collaborative feel, in terms of interacting and sharing best practices with science," Michaud said. "We're staying on par in terms of our contribution with the science in collaboration with the community."

Implementing massive supercomputers like these is not like setting up a new desktop computer. "Generally the transition between one contract to another takes about a two-year timeframe," Michaud said. But in about a year's time, if everything goes according to plan, NOAA developers will start transitioning code over to the new system, checking it's all configured correctly. And if that goes off without a hitch, Michaud said, the computers should be fully operational by February 2022.


Given that weather information is so critical for businesses that rely on rapid supply chains, some companies will be eagerly awaiting the results.

"If you know, for example, the upper plains of the Mississippi is going to flood, then companies can take advantage of that, and maybe advanced-ship or reroute," Caplice said. "I can see in weather disasters it could be very helpful."
