Climate

The West’s drought could bring about a data center reckoning

Data centers are the tech industry’s secret water hogs, and they could soon come under increased scrutiny.

Lake Mead, the United States' largest artificial reservoir, has dropped to about 1,052 feet above sea level, the lowest it's been since being filled in 1937.

Photo: Mario Tama/Getty Images

The West is parched, and getting more so by the day. Lake Mead, the country’s largest reservoir, is nearing “dead pool” levels, meaning the water may soon be too low to flow downstream through the Hoover Dam. The entire Four Corners region and California are mired in megadrought.

Amid this desiccation, hundreds of the country’s data centers use vast amounts of water to hum along. Dozens cluster around major metro areas, including some with mandatory or voluntary water restrictions in place to curtail residential and agricultural use.

Exactly how much water they use, however, is an open question, given that many companies don’t track it, much less report it. While their energy use and accompanying emissions have made more headlines, data centers’ water usage is coming under increasing scrutiny. And as climate change makes water scarcer, pressure could grow on hyperscale data centers to disclose their water use and factor scarcity into where and how they operate.

Data centers consume water both directly (for cooling) and indirectly (through the water consumed in non-renewable electricity generation). Roughly one-fifth of the data center servers in the U.S. source water directly from moderately to highly water-stressed watersheds, according to a 2021 analysis published in Environmental Research Letters.
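To make those two streams concrete, here is a minimal back-of-envelope sketch in Python; every input value below is an illustrative assumption, not a figure from the study:

    # Rough water-footprint estimate for a hypothetical data center.
    # All inputs are illustrative assumptions, not measured values.
    direct_cooling_liters = 500_000_000   # assumed annual cooling water use
    electricity_kwh = 400_000_000         # assumed annual electricity consumption
    grid_water_liters_per_kwh = 1.8       # assumed water intensity of local generation

    indirect_liters = electricity_kwh * grid_water_liters_per_kwh
    total_liters = direct_cooling_liters + indirect_liters

    print(f"Indirect (electricity) water use: {indirect_liters / 1e9:.2f} billion liters")
    print(f"Total annual footprint: {total_liters / 1e9:.2f} billion liters")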

“There’s a trade-off between energy efficiency and water use” in terms of how a company cools a data center, said Arman Shehabi, a research scientist at Lawrence Berkeley National Laboratory and one of the study’s authors. Water availability, he said, likely ranks low on the priority list compared with the price of electricity when companies decide where to build.

Drought, and specifically the ongoing megadrought in the West, is “pretty new on the radar” of these companies, said Shehabi. In recent decades, many data center companies moved toward using water rather than electricity to keep servers cool in a bid to keep energy use and emissions down, he added. But it is becoming increasingly clear that water is a potentially scarce resource in many regions as well.

The megadrought gripping the West is the worst since at least 800 A.D., and climate change is increasing the odds of more dry weather to come. Other parts of the world are also increasingly water-stressed due to the impact of climate change as well as people relying more heavily on groundwater for everything from drinking to agriculture.

While these other uses tend to be meticulously measured, data center water use is much more opaque, a fact that has already led to tension at the local level between companies and communities that have been asked to cut their use in times of drought. The Uptime Institute, which advises the information technology sector on improving infrastructure performance and efficiency, found in its latest Data Center Industry Survey that just 51% of respondents measure water use in some way, even as tracking of energy use and emissions becomes standard fare.

“There is very limited data available on data center water consumption,” said David Mytton, who authored a 2021 npj Clean Water article on the subject. Compared with energy use, he continued, “water consumption is much more behind-the-scenes, is much more controversial, and in some cases is considered, or has been considered in the past, a trade secret.”

Few hyperscale data center owners publicize their total water use data, and it’s even harder to find information on specific data centers. Still, there are some signs of progress as companies increasingly consider not only how their operations impact the climate, but also the reverse.

Microsoft has begun sharing its aggregate consumption in its annual sustainability reports. The 2021 installment shows that the company’s appetite for water has grown steadily over the last five years, from roughly 67.5 million cubic feet in 2017 to just over 158 million cubic feet in 2021. The company has increased the volume of water it replenishes in that period as well, from none in 2017 to roughly 71 million cubic feet in 2021. The company aims to increase replenishment to the point where it outstrips its water consumption by 2030, and reduce data center water waste by 95% by 2024.
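Some quick arithmetic on those figures puts the trend in perspective (the unit conversion is standard; the rest simply restates Microsoft’s reported numbers):

    # Back-of-envelope math on Microsoft's reported figures, in millions of cubic feet.
    GALLONS_PER_CUBIC_FOOT = 7.48

    consumed_2017, consumed_2021 = 67.5, 158.0
    replenished_2021 = 71.0

    growth = (consumed_2021 - consumed_2017) / consumed_2017
    net_2021 = consumed_2021 - replenished_2021

    print(f"Consumption growth, 2017-2021: {growth:.0%}")    # roughly 134%
    print(f"Net 2021 consumption: {net_2021:.1f}M cubic feet "
          f"(about {net_2021 * GALLONS_PER_CUBIC_FOOT:.0f}M gallons)")

By that arithmetic, replenishment still covered less than half of consumption in 2021, which is the gap the 2030 pledge is meant to close.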

Noelle Walsh, corporate vice president of Microsoft’s Cloud Operations and Innovation team, said the company is looking into “new, waterless cooling solutions, such as two-phase liquid immersion cooling” to help reach these goals.

Microsoft’s main competitors — AWS and Google — are at least tracking water data for internal use. Google has also set a water stewardship pledge, saying it will replenish 120% of the water it consumes by 2030. Neither of the two, however, responded on the record to questions from Protocol about how much water their data centers consume.

And while AWS has not committed to a certain timeline, it has stated publicly that it is working to use water more efficiently and use less potable water to cool its data centers. A spokesperson said that while the company tracks its own water use, it has not had to cut back as a consequence of the West’s drought.

Arno van Gennip, the vice president of Operations Engineering at Equinix, a digital infrastructure company with more than 240 data centers globally, said that measuring water use is still fairly new because, quite simply, water has historically been both abundant and cheap.

However, measuring water use is an increasingly crucial part of making data centers more efficient. “If you don’t measure, you cannot manage,” van Gennip said.

While many of Equinix’s centers were already tracking use, the company began doing so across the board last year. It is in the process of analyzing that data for accuracy before publicly sharing it, van Gennip said.

Water usage effectiveness has emerged as a metric for gauging efficiency. It is calculated by dividing the total liters of water used for humidification and cooling by the center’s total annual energy use in kilowatt-hours. While water usage effectiveness was inspired by the now-widespread metric of power usage effectiveness, Shehabi said it has yet to catch on in the same way.
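As a minimal sketch of that calculation (the inputs below are hypothetical; The Green Grid convention expresses the result in liters per kilowatt-hour):

    # Water usage effectiveness (WUE): liters of water per kilowatt-hour of energy.
    # Input values are hypothetical, for illustration only.
    water_liters_per_year = 250_000_000   # water used for humidification and cooling
    energy_kwh_per_year = 500_000_000     # annual energy use

    wue = water_liters_per_year / energy_kwh_per_year
    print(f"WUE: {wue:.2f} L/kWh")        # lower is better, as with PUE

Like power usage effectiveness, the metric only becomes useful once operators measure the inputs consistently, which, as Shehabi notes, most have yet to do.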

“You’ve got to have some sort of efficiency metric before anyone starts working to improve on it. And that’s happening with energy use, at least on the cooling side … but we haven’t quite seen that with water,” he said.

But change might be afoot, at least when it comes to factoring drought into data centers’ plans. Van Gennip said that Equinix began considering the subject early in 2021, and is looking into contingency plans for the impacts of both drought and flood.

He predicted that companies will weigh water access when siting new data centers as the resource becomes increasingly precious. “This is one of the areas where we need to be prepared,” van Gennip said.
