Enterprise

How the Big Three cloud providers are helping customers manage their energy consumption and carbon emissions

AWS, Microsoft and Google Cloud are offering more tools for enterprises to map, measure and reduce their carbon footprints to meet ESG reporting requirements and answer calls for transparency about environmental impact.

Footprints in a cloud

Cloud providers’ data centers are energy-intensive: How is that measured?

Illustration: Christopher T. Fong/Protocol

As AWS, Microsoft Azure and Google Cloud work toward their carbon-free and net zero carbon emissions goals, they’re also helping their customers understand their own cloud-related carbon footprints and take steps to reduce their impacts.

All three have released tools that, in varying degrees, measure estimated carbon emissions tied to individual customers’ cloud infrastructure and services usage and help them work more sustainably. Enterprises can use those tools to make and track progress toward their carbon-reduction targets and meet environmental, social and corporate governance (ESG) reporting requirements.

Cloud providers’ data centers are energy-intensive, and the electricity used to run them generates greenhouse gas emissions: primarily carbon dioxide, which is tied to global warming.

“Consumers, employees, investors and policymakers are demanding that organizations prioritize sustainability and be transparent about the impact they're having on the environment and the progress they're making on their sustainability initiatives,” Google Cloud CEO Thomas Kurian said during the cloud provider’s inaugural Sustainability Summit last month.

For cloud customers, it comes down to “map, measure, reduce,” said Christopher Wellise, AWS’ director of sustainability. Customers need to map their operational boundaries, use tools to measure the carbon impact and then create targets and strategies for reduction.

“Then it's look for ways to transform their own business — what products are they innovating, what are their customers looking for — and begin to embed sustainability into their innovation practices,” Wellise told Protocol.

Christopher Wellise, AWS’ director of sustainability. Photo: AWS

It's unclear how last month’s Supreme Court ruling, which limited the Environmental Protection Agency’s ability to regulate emissions from existing coal- and natural gas-fired power plants, will impact enterprises’ plans. But the Securities and Exchange Commission unveiled proposed rule changes in March that would force public companies to make certain climate-related risk disclosures, including their emissions, to provide greater transparency for investors.

Either way, certain large multinational companies and financial institutions doing business or investing capital in Europe still face sustainability requirements under EU rules, even if they’re U.S.-based, according to Elisabeth Brinton, Microsoft’s corporate vice president of sustainability.

“The EU made their jurisdictional authority for sustainability very similar to GDPR and privacy,” Brinton told Protocol. “So the market and where we have to go in terms of enabling not only carbon emissions reductions, but then across ESG more broadly, actually flows through and across to the U.S. companies that are global. It touches down into your cost centers, regardless of where they are.”

Here's a look at how the Big Three cloud providers have been moving toward their carbon goals and helping customers decarbonize their applications and infrastructure, and how other technology companies are jumping into the business.

AWS

Amazon co-founded The Climate Pledge in 2019, committing to achieve net zero carbon emissions across its businesses by 2040, including plans to power its operations with 100% renewable energy.

“We have a 2030 target of reaching 100% renewable energy, but we're actually five years ahead of schedule,” Wellise said.

Amazon bills itself as the world’s largest corporate purchaser of renewable energy. It has announced more than 310 renewable energy projects globally, including wind and solar farms, that it says will have the capacity to deliver more than 42,000 gigawatt-hours of renewable energy annually – enough to power more than 3.9 million U.S. homes per year.

Enterprises can start to reduce their carbon emissions just by moving their workloads from on-premises data centers to the cloud, according to Wellise.

“There are big benefits, obviously, just moving into cloud primarily, and then there are some things we're doing once you're within cloud to help optimize workloads for customers, which further drives down their carbon footprint,” he said.

On the demand side, AWS designed its own semiconductor chips to run specific workloads and further drive energy efficiencies in its data center infrastructure, Wellise noted. They include its Arm-based AWS Graviton processors: Graviton3-based compute instances use up to 60% less energy than comparable Intel- or AMD-based instances for the same performance, according to AWS.

“We're really achieving huge economies of scale,” Wellise said, pointing to AWS-commissioned 451 Research studies that found AWS’ infrastructure is 3.6 times more energy-efficient than the median of surveyed U.S. enterprise data centers and up to five times more energy-efficient than average data centers in Europe and Asia. “Two-thirds of that is accomplished through our economies of scale and specific hardware design, and the other third of that is driven by our renewable energy programs. What that results in is up to an 80% reduction in carbon footprint associated with our customers’ workloads.”
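As a rough sanity check, the arithmetic behind those figures can be sketched. The numbers below simply combine the claims in the quote; they are not AWS’ published methodology:

```python
# Back-of-the-envelope check of the "up to 80%" claim, using only the
# figures quoted above (illustrative arithmetic, not AWS' actual model).

efficiency_factor = 3.6  # AWS vs. the median surveyed U.S. enterprise data center

# If a workload needs 1/3.6 of the electricity, energy use drops ~72%.
energy_reduction = 1 - 1 / efficiency_factor
print(f"Reduction from efficiency alone: {energy_reduction:.0%}")

# To reach an 80% overall carbon reduction, the remaining electricity
# must also be cleaner. Required carbon-intensity ratio vs. baseline:
total_reduction = 0.80
intensity_ratio = (1 - total_reduction) / (1 - energy_reduction)
print(f"Required grid-intensity ratio: {intensity_ratio:.2f}")
```

With efficiency alone cutting energy roughly 72%, a grid about 28% cleaner closes the gap to 80%, which is broadly consistent with Wellise’s two-thirds/one-third split.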

AWS’ Customer Carbon Footprint Tool, which became generally available in March, allows customers to see the estimated carbon impacts of their AWS workloads down to the service level for its EC2 compute service and S3 storage service. Customers also can get an estimate of the carbon emissions they avoided by using AWS instead of on-premises data centers, a calculation based on the 451 Research report findings.

AWS' Customer Carbon Footprint Tool shows Scope 1 and 2 emissions. Image: AWS

The Customer Carbon Footprint Tool shows AWS’ Scope 1 and Scope 2 emissions associated with a customer’s cloud use from January 2020 onward. Scope 1 emissions come directly from AWS’ operations, such as the energy consumed by its data centers; Scope 2 emissions are indirect emissions from the generation of purchased energy, such as the production of electricity used to power AWS facilities.

The dashboard calculates those emissions monthly, but the data is reported on a three-month delay due to the billing cycles of AWS’ electric utility suppliers. Customers can measure changes in their carbon footprints over time as they deploy new resources on the cloud and review forecasted emissions based on their current usage and AWS’ renewable energy project road map.

The Customer Carbon Footprint Tool, which is available in AWS’ billing console, uses the Greenhouse Gas Protocol accounting standards.
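In rough terms, the per-service figures such a dashboard reports come down to the electricity attributed to a customer multiplied by an emissions factor for each scope. A minimal sketch, with hypothetical factors and field names rather than AWS’ actual model:

```python
from dataclasses import dataclass

# Illustrative sketch of a per-service Scope 1/2 estimate. All factors
# and field names here are invented; AWS does not publish this model.

@dataclass
class ServiceUsage:
    service: str       # e.g. "EC2", "S3"
    energy_kwh: float  # estimated electricity attributed to the customer

GRID_INTENSITY_KG_PER_KWH = 0.4  # hypothetical regional average (Scope 2)
DIRECT_KG_PER_KWH = 0.02         # hypothetical on-site share, e.g. backup generators (Scope 1)

def monthly_footprint(usages):
    """Return estimated Scope 1 and Scope 2 kgCO2e per service."""
    report = {}
    for u in usages:
        report[u.service] = {
            "scope1_kg": round(u.energy_kwh * DIRECT_KG_PER_KWH, 2),
            "scope2_kg": round(u.energy_kwh * GRID_INTENSITY_KG_PER_KWH, 2),
        }
    return report

usage = [ServiceUsage("EC2", 1200.0), ServiceUsage("S3", 300.0)]
print(monthly_footprint(usage))
```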

“Whether it's governments, nonprofits, other organizations that are using our services, many of them are involved in either mandatory or voluntary related carbon reporting,” Wellise said. “And if they're a large SaaS provider or somebody that has a large percentage of their footprint tied up in IT, it's really important that they understand what that footprint is.”

But since the tool’s rollout, AWS has drawn some criticism for its lack of transparency, such as not disclosing its Scope 3 emissions and aggregating emissions data by broader geographies instead of breaking it down at the cloud-region level. RedMonk analyst James Governor referred to it as a “Version One product,” saying an API would help developers build carbon-tracking functionality into their apps or access the emissions data via their preferred command-line tools or editors.

“The calculator also doesn’t initially have an easy way to compare and model carbon intensity in different regions — that’s something that we will hopefully see sooner rather than later,” Governor wrote in April. “Instead, the calculator is initially positioned to illustrate the benefits of AWS hosting over self-hosting in your own data centres. Reasonable enough, but the real charm will be when customers can make better decisions about the sustainability of their cloud workloads.”

Wellise acknowledged that customers would like more regional granularity and an API to parse the emissions data on their own. Including Scope 3 emissions and “further definition for regional differences” are on AWS’ road map, according to an AWS spokesperson.

Once customers get their carbon data, the conversation moves to optimization, according to Wellise. In March, AWS added a Sustainability Pillar to its Well-Architected Framework, which provides a set of best practices for designing and operating workloads in the AWS cloud.

“They can actually drive down and architect workloads in a way that they optimize for carbon,” Wellise said.

Microsoft

Rival Microsoft has set a goal to become carbon-negative by 2030. Two years ago, it announced a $1 billion climate innovation fund to spur the development of carbon reduction, capture and removal technologies; Climeworks is among its investments. Microsoft this month signed a 10-year agreement under which Climeworks, which specializes in direct air capture, will permanently remove 10,000 tons of carbon emissions from the atmosphere on Microsoft’s behalf. And last month, the Microsoft Climate Research Initiative launched with a focus on overcoming constraints to decarbonization, reducing uncertainties in carbon accounting and assessing climate risks in greater detail.

For customers, Microsoft’s Cloud for Sustainability became generally available in June as a set of ESG capabilities from across its cloud portfolio, including Office 365 products such as Excel as well as products and services from partners.

More than 60% of sustainability-related data from global enterprises sits in Excel, according to Brinton.

By pulling together enterprises’ Excel data and edge or IoT data, the Cloud for Sustainability provides an extensible data platform for unified data models and for turning that data into actionable insights that drive a “double bottom line of corporate performance, along with actual measurable impact around ESG,” she said.

Elisabeth Brinton, Microsoft’s corporate vice president of sustainability, said even U.S. companies face EU climate rules. Photo: Microsoft

Microsoft’s Sustainability Manager app is a baseline tool to help customers get a handle on their Scope 1, 2 and 3 emissions, according to Brinton. It automates data collection, centralizing disparate data into a common format to enable customers to record, monitor, analyze and report their emissions in near real time, and set and track sustainability targets.

“A typical enterprise is going to have well over 100,000 different cost centers, and so being able to pull up and actually report and understand exactly your carbon emissions status by cost center — that's a huge data science challenge,” Brinton said.

Microsoft’s Emissions Impact Dashboard for Azure became generally available last October. The Power BI application lets customers track, report and reduce the carbon emissions associated with their Azure cloud usage. Its dashboard lets customers drill down into Scope 1, 2 and 3 emissions by month, service and data-center region, and enter non-migrated workloads to get estimates of emissions savings from migrating to Azure.

“It helps them with critical insights, helps them make informed, data-driven decisions about their own sustainable computing,” Brinton said. “It is a really, really great tool that gives you that real-time information.”

Microsoft’s Emissions Impact Dashboard for Microsoft 365, which allows customers to track GHG emissions tied to their use of applications including Microsoft Teams and Exchange Online, is in preview.

Microsoft also is continuing to focus on opportunities for sustainable low-code, no-code options, according to Brinton.

“Low-code/no-code is an example of a method that you can actually derive sustainable improvements [from] because you're actually lowering the energy intensity, as it were, of your ability to develop code or compute,” she said.

Google Cloud

Google Cloud, which says it’s been carbon-neutral since 2007, has matched 100% of its electricity consumption with renewable energy since 2017 and maintains it operates the “cleanest cloud.” Its “moonshot” goal is to use carbon-free energy 24/7 in all of its data centers and offices by 2030 — which means it would match its electricity use with carbon-free energy for every hour in every region where it operates — as part of its goal to reach net zero emissions across its operations that year.

Google Cloud’s Carbon Footprint, in preview as of last October, allows customers to measure, report and reduce their carbon emissions by providing the gross carbon emissions associated with the electricity from their Google Cloud Platform usage. Customers can monitor their cloud emissions by product, project and region. Google Cloud is adding Scope 1 and 3 emissions to that reporting data.

“In addition to accounting for our customers' Scope 2 emissions associated with the production of the energy that we use, customers will also be able to access data on the emissions from the sources we control directly, as well as the relevant emissions of Google's Scope 3 apportioned to customer usage,” Justin Keeble, managing director of global sustainability at Google Cloud, told reporters in a briefing last month. “This will give our customers the most comprehensive view possible of the emissions associated with their cloud usage.”

Google Cloud's Carbon Footprint allows customers to measure, report and reduce their carbon emissions. Image: Google Cloud

Customers can export data from Carbon Footprint to Google Cloud’s BigQuery data warehouse to perform analytics and visualizations, in addition to using the data for sustainability reporting requirements. Google Cloud publishes its calculation methodology so auditors and reporting teams can verify that data meets Greenhouse Gas Protocol frameworks for measuring emissions. Non-technical users of Google Cloud, such as sustainability teams, also will be able to access the data for reporting purposes.
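The roll-ups a reporting team might run on exported footprint data are straightforward aggregations. A minimal sketch (the row schema below is invented for illustration and is not Google Cloud’s actual export format):

```python
from collections import defaultdict

# Hypothetical exported rows: one emissions figure per project per region.
rows = [
    {"project": "web", "region": "europe-west1", "kg_co2e": 10.5},
    {"project": "web", "region": "us-central1",  "kg_co2e": 32.0},
    {"project": "ml",  "region": "europe-west1", "kg_co2e": 7.5},
]

def emissions_by_region(rows):
    """Sum estimated emissions per cloud region for a reporting period."""
    totals = defaultdict(float)
    for r in rows:
        totals[r["region"]] += r["kg_co2e"]
    return dict(totals)

print(emissions_by_region(rows))
```

The same grouping by project or product gives the other views Carbon Footprint exposes.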

Early next year, Google Cloud plans to release Carbon Footprint for Google Workspace (its cloud-based productivity and collaboration tools) so customers can understand emissions associated with products including Gmail, Google Meet and Google Docs.

Carbon Footprint is part of Google Cloud’s Carbon Sense collection of tools that includes features from products such as Active Assist — its tools for customers to optimize their cloud operations — and Region Picker. Google Cloud added a sustainability category to Active Assist, and its unattended project recommender uses machine learning to estimate the gross carbon emissions that customers can save if they remove abandoned or idle cloud resources.

“In addition to intentionally shortening resource schedules, you can also proactively delete unused VMs, optimize VM shapes, as well as shut down inactive projects,” said Alexandrina Garcia Verdin, a cloud and sustainability developer relations engineer at Google Cloud. “This is where the Active Assist tool really shines, as it proactively suggests carbon-reducing configurations, along with other cost-performance and security-friendly actions.”

One of the most impactful steps a customer can take to reduce cloud-related emissions is using Region Picker to select cloud regions powered by cleaner energy, Keeble said. Google Cloud last year unveiled the carbon characteristics of its cloud regions and icons identifying low-carbon cloud regions so customers can choose “cleaner” ones for their work. Region Picker helps customers compare priorities around lowering emissions versus pricing and latency.
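That trade-off can be sketched as a weighted score over candidate regions. Everything below (region data, weights, scoring) is invented for illustration and is not Google Cloud’s actual algorithm:

```python
# Hypothetical region data: carbon intensity (kgCO2/kWh), relative price,
# and round-trip latency from the customer's users.
REGIONS = {
    "region-a": {"carbon": 0.10, "price": 1.2, "latency_ms": 80},
    "region-b": {"carbon": 0.45, "price": 1.0, "latency_ms": 20},
    "region-c": {"carbon": 0.20, "price": 1.1, "latency_ms": 40},
}

def pick_region(regions, w_carbon=0.5, w_price=0.25, w_latency=0.25):
    """Lower score is better; each metric is normalized to its maximum."""
    max_c = max(r["carbon"] for r in regions.values())
    max_p = max(r["price"] for r in regions.values())
    max_l = max(r["latency_ms"] for r in regions.values())

    def score(r):
        return (w_carbon * r["carbon"] / max_c
                + w_price * r["price"] / max_p
                + w_latency * r["latency_ms"] / max_l)

    return min(regions, key=lambda name: score(regions[name]))

print(pick_region(REGIONS))  # balanced weights
print(pick_region(REGIONS, w_carbon=1.0, w_price=0.0, w_latency=0.0))  # carbon only
```

Shifting the weights toward carbon, price, or latency changes the winner, which is exactly the comparison Region Picker lets customers make interactively.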

Google Cloud also has introduced low-carbon mode, which lets customers automatically restrict their cloud resources to low-carbon locations across Google Cloud infrastructure with a few clicks.

“Setting defaults can really just simplify the number of priorities put on developers while still ensuring the apps they build run on as low carbon infrastructure as possible,” Kate Brandt, Google’s chief sustainability officer, said during the Sustainability Summit. “For organizations where digital infrastructure is a considerable part of their supply chain footprint, prioritizing sustainable infrastructure … can really make a huge difference.”

Salesforce, a Google Cloud customer that’s been prioritizing low-carbon infrastructure, expects to reduce its yearly gross emissions of certain workloads by roughly 80% with Google Cloud, Brandt said.

Google Cloud is sharing 24/7 carbon-free energy data with customers under a new pilot program announced last month. The information, collected by Google Cloud and its partners over 10 years, includes historical and real-time regional energy grid and carbon data at hourly levels. Customers will be able to see their electricity emissions profile, baseline their carbon-free energy (CFE) score and their Scope 2 emissions footprint from indirect GHG emissions, and forecast and plan an optimized energy portfolio to achieve their desired CFE score, including by executing carbon-free energy transactions.

The cloud provider last month also rolled out Google Cloud Ready - Sustainability, a new validation program for partners with products and services on Google Cloud that assist customers in achieving sustainability goals, including reducing carbon emissions, increasing the sustainability of their value chains and processing ESG data. The products and services will be available through a new Google Cloud Marketplace Sustainability Hub.

Other efforts

Other companies also are jumping into the mix. Alibaba Cloud last month released Energy Expert, software for customers to manage the carbon emissions of their operations and products. Cloud Carbon Footprint, an open-source project sponsored by Thoughtworks, provides tooling to measure, monitor and reduce cloud carbon emissions, including embodied emissions from manufacturing, and works for multiple cloud providers, including AWS, Microsoft Azure and Google Cloud.

Cirrus Nexus, which has an artificial intelligence-driven cloud management platform, in May launched TrueCarbon, a carbon-reduction tool that currently works for AWS and Microsoft Azure.

“We look at actual consumption,” Cirrus Nexus CEO Chris Noble told Protocol. “We don't just take a database or a virtual machine or some sort of workload and say, ‘OK, this is about how much carbon.’ We actually look at it in five-minute increments. We don't rely on the reporting of the CSP [cloud service provider]. Our interest isn't driving utilization or driving efficiency in their data centers. Our goal is to give our customers a very clear, honest view of how much carbon they're causing to be produced, regardless of what offsets, what carbon credits CSPs buy.”

TrueCarbon uses real-time information from energy production data that’s published on an hourly basis for the U.S., U.K. and EU, according to Noble.

“Every hour, we know what that composition on the energy grid is,” he said. “We know how much of the energy is nuclear, coal, wind, solar. So every five minutes, we look at how much power they're consuming per workload, and then we translate that to how much energy it's consuming off the grid. And we translate how much carbon that's caused to be produced by consuming that energy.”
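The calculation Noble describes can be sketched in two steps: derive an hourly grid carbon intensity from the generation mix, then convert five-minute power samples into energy and emissions. All factors and values below are illustrative, not Cirrus Nexus’ data:

```python
# Illustrative emission factors in kgCO2 per kWh generated, by source.
EMISSION_FACTORS = {"coal": 1.0, "gas": 0.45, "nuclear": 0.0, "wind": 0.0, "solar": 0.0}

def grid_intensity(mix):
    """Weighted kgCO2/kWh for an hourly grid mix, e.g. {'coal': 0.3, ...}."""
    return sum(share * EMISSION_FACTORS[src] for src, share in mix.items())

def workload_emissions(power_watts_5min, mix):
    """Emissions (kg) over one hour of five-minute workload power samples."""
    # Each sample covers 5/60 of an hour; energy in kWh = watts / 1000 * hours.
    energy_kwh = sum(w / 1000 * (5 / 60) for w in power_watts_5min)
    return energy_kwh * grid_intensity(mix)

hour_mix = {"coal": 0.3, "gas": 0.2, "nuclear": 0.2, "wind": 0.2, "solar": 0.1}
samples = [120.0] * 12  # a steady 120 W draw for an hour
print(round(workload_emissions(samples, hour_mix), 4))
```

Running the same hour against a cleaner mix yields proportionally lower emissions, which is what makes region-shifting worthwhile.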

TrueCarbon also allows customers to automate changes, according to Noble.

“If a company really wanted to get aggressive about it, we can move their workloads from region to region to get the best carbon efficiency,” he said. “Our tool will actually go out and make those changes for them on the fly.”

Cloud providers pour billions of dollars into their data centers and have a vested interest in driving business through them, even if they’re not as environmentally sound as data centers in other cloud regions, Noble said.

“They built data centers where there's … lots of reliance on coal and oil and natural gas,” he said. “They're not going to fold them up tomorrow. We believe things like carbon credits are helpful and they're good, they draw attention, but they don't really solve anything. Carbon offsets like planting trees, you know it’s good, but it doesn’t really change the amount of carbon being produced.”
