Cloud spending is hard to control. Cloud providers only do so much to help.

Customers must take the initiative, supplementing cloud vendors’ tools with third-party software to optimize their cloud usage and spending.

Clouds with an obscured price

Cloud computing isn't just a rental server market; it's an entirely new way of building and maintaining applications.

Image: Christopher T. Fong/Protocol


As cloud computing has become mainstream, gaining visibility into cloud costs and controlling them has become a perennial issue for enterprises. And the cost management tools offered by the cloud providers don’t get the job done.

Cloud costs continue to grow, and the amount of wasteful spending remains high, according to the Flexera 2022 State of the Cloud Report survey released in March. For the sixth straight year, optimizing their existing cloud use to realize cost savings — sometimes known as FinOps — was the No. 1 cloud priority this year for enterprise technical and business professionals. Better financial reporting on cloud costs also was a top initiative.

“FinOps and this whole optimization and cost management are a critical part of the cloud operating model that customers have to put in when they're moving into the cloud estates,” said Chris Wegmann, managing director of Accenture’s AWS business group technology and practice. “[It’s] usually a function that gets skipped or missed or is not invested in as much.”

Deploying FinOps encourages data-driven management of cloud spending and increases efficiency, typically cutting cloud bills by 20% to 30%, according to Accenture.

While cloud vendors provide some tools and support through technical account managers, customers must play a very active role, according to Wegmann. That includes knowing what their workloads need and making the necessary changes to optimize them, or risk paying more than they should.

“It is a shared responsibility, because … AWS can't make those changes on your behalf,” Wegmann said. “A lot of customers, in their on-premises environments, don't have that visibility, so they don't know where the spend is. But that's a big advantage in cloud.”

It’s all about data

The cloud providers offer basic cost-management services, but they typically must be augmented by third-party software and require cloud users to do some of the heavy lifting.

“Data is absolutely critical for cloud cost management, optimization, and that usually comes down to tagging,” Wegmann said. “That means you tag every piece of your cloud estate and then [make] sure that data model brings that all together. The tools themselves don't automatically create that data model or that mapping, and it's something that has to be done.”
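In code, the tag-driven rollup Wegmann describes amounts to aggregating billing line items by a cost-allocation tag. This is a minimal sketch with hypothetical field names, not an actual cloud billing schema; real exports (such as AWS Cost and Usage Reports) carry tags as per-line resource metadata:

```python
from collections import defaultdict

# Hypothetical billing line items; field names are illustrative only.
line_items = [
    {"service": "ec2", "cost": 120.0, "tags": {"team": "search"}},
    {"service": "s3",  "cost": 30.0,  "tags": {"team": "search"}},
    {"service": "ec2", "cost": 75.0,  "tags": {}},  # untagged resource
]

def cost_by_tag(items, tag_key):
    """Aggregate spend by a tag key; untagged spend is surfaced, not hidden."""
    totals = defaultdict(float)
    for item in items:
        totals[item["tags"].get(tag_key, "UNTAGGED")] += item["cost"]
    return dict(totals)

print(cost_by_tag(line_items, "team"))
# {'search': 150.0, 'UNTAGGED': 75.0}
```

The untagged bucket is the point: spend that can't be mapped to a team or product is exactly the spend no one is accountable for optimizing.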

AWS’ cloud cost management product suite includes AWS Cost Explorer to help customers visualize and manage their AWS costs and usage across all accounts; AWS Budgets to improve planning and cost control with budgeting and forecasting; and AWS Cost Anomaly Detection, a machine-learning model that learns customers’ historic spending patterns to detect one-time cost spikes and continuous cost increases. AWS Trusted Advisor inspects a customer’s AWS infrastructure and provides best-practice recommendations when there are opportunities to save money and optimize system availability, performance and security.
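The anomaly-detection idea is straightforward to sketch. The toy z-score rule below is an illustration of the concept, not AWS Cost Anomaly Detection's actual model, which is a proprietary machine-learning service:

```python
import statistics

def flag_spikes(daily_costs, window=7, threshold=3.0):
    """Flag days whose cost deviates sharply from the trailing window."""
    anomalies = []
    for i in range(window, len(daily_costs)):
        history = daily_costs[i - window:i]
        mean = statistics.mean(history)
        stdev = statistics.stdev(history) or 1e-9  # guard flat history
        if (daily_costs[i] - mean) / stdev > threshold:
            anomalies.append(i)
    return anomalies

costs = [100, 102, 99, 101, 100, 98, 103, 100, 480]  # one-time spike on day 8
print(flag_spikes(costs))  # [8]
```

A real service also has to learn seasonality and gradual growth, which is why continuous cost increases are harder to catch than one-time spikes.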

For customers that know what kind of computing performance their applications require, AWS offers more than 475 instance types to choose from for the best price performance, noted Chris Grusz, AWS’ director of Business Development, AWS Marketplace, Service Catalog and Control Tower.
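Choosing among hundreds of instance types reduces to a constrained cost minimization. The sketch below uses made-up instance names and prices, not real AWS pricing:

```python
# Hypothetical catalog: (name, vCPUs, memory_GiB, hourly_price_usd)
catalog = [
    ("gp.small",  2,  8, 0.10),
    ("gp.large",  8, 32, 0.38),
    ("cpu.large", 8, 16, 0.30),
    ("mem.large", 4, 64, 0.45),
]

def cheapest_fit(catalog, vcpus, mem_gib):
    """Cheapest instance type that satisfies the workload's requirements."""
    candidates = [c for c in catalog if c[1] >= vcpus and c[2] >= mem_gib]
    return min(candidates, key=lambda c: c[3], default=None)

print(cheapest_fit(catalog, vcpus=8, mem_gib=16))
# ('cpu.large', 8, 16, 0.3)
```

The hard part in practice isn't the search; it's knowing the workload's real requirements well enough to avoid paying for the memory-heavy type when the compute-optimized one would do.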

“We have over 15 different database types … so you can pick the exact database that is going to suit what you're actually looking for and then run that on the exact instance type that you need,” Grusz said. “And then, of course, we're always looking to reduce that price wherever we can. [AWS] Marketplace also integrates into a number of our partner solutions, so that if a customer wants to view third-party spend for SaaS applications alongside their AWS consumption, we also have ways to feed into a lot of the popular cost-management products that are on the market.”

Accenture itself has a very large AWS presence that it’s constantly looking to optimize, and it employs the same internal tool that it uses with customers — the Accenture myNav platform — to optimize its cloud architecture, track and manage consumption and adjust spending accordingly, according to Wegmann.

“We have tens of millions of lines of billing data that come in; that means tens of millions of different services running,” Wegmann said. “We aggregate that and bring that to 1,000 or so different teams. Through that, we're constantly looking at optimization, ways to save money, turn stuff off when it's not being used, move from expensive instance types to lower cost. Sometimes it's cheaper storage or just changing … to more serverless, which is usually cheaper than having servers just sit around to process things.”

Accenture and almost all its AWS customers use a combination of native AWS and third-party tools such as CloudHealth by VMware or Cloudability to help monitor and control their cloud costs, as there’s not a single tool that can do it all, Wegmann said.

“Unfortunately, it takes two or three different tools, as well as what comes from Amazon, to do that,” Wegmann said. “We spend a lot of time with our customers, helping them make those decisions. Some of them do assessments or do free trials [of third-party software] or things like that to understand the capabilities and understand where it fits into their gaps.”

Help me help you

Helping customers control their cloud costs comes down to maintaining long-term relationships, according to Amit Zavery, general manager and head of platform for Google Cloud.

“If they’re successful, we're successful,” said Zavery, who manages Google Cloud’s commerce, billing and monetization product team. “If they have budget constraints or budget requirements, or they want to have a future plan around their usage, we give them the controls. Otherwise it would be very difficult for them to scale and eventually use a lot more services.”

Zavery’s team builds the cost-control and cost-management capabilities provided to customers through the Google Cloud Console, which incorporates customers’ cloud usage and historical data.

“We've been trying to make sure that we provide as many tools and capabilities [as possible] so that customers can look at their usage, manage the cost and optimize their capability usage as well so that they can have a better outcome using our products from the economic perspective,” Zavery said. “That data also is available to them to extract and put it in any kind of analysis tool. We have value advisers inside Google Cloud as well who can work very closely with customers.”

Google Cloud in March announced a series of price hikes for its infrastructure services that will take effect Oct. 1. It’s adding new data replication fees and network egress charges, and doubling prices for some services, such as Coldline storage operations.

“We just brought our pricing somewhat to the industry standards versus anything else,” Zavery said. “We, of course, have been working very closely with the customers anytime we make any pricing changes, and we also try to provide insights to our customers into what their bills will look like.”

Microsoft declined to discuss how it helps customers monitor and control their cloud costs.

“Tracking cloud usage and associated costs should be a transparent process for any customer using the cloud,” a Microsoft spokesperson said in a statement. “That’s why Microsoft offers a suite of services, like Azure Cost Management and Billing, to ensure customers have the tools to accurately forecast and manage their cloud usage.”

“Intractable” problem

While the Big Three cloud providers say they help customers or make it easy for customers to monitor and control their costs, Corey Quinn begs to differ.

“All three of them will tell you they absolutely do, but they do not,” said Quinn, chief cloud economist for The Duckbill Group, which helps AWS customers manage their cloud costs. “They offer different tools, different points of visibility and the rest, but part of the problem is that they are distant enough from their customers that they do not fully understand the scope of the problem. And on some level, let's be clear, this is an intractable problem to have to solve.”

The cloud providers’ convoluted pricing structures effectively have become inscrutable, according to Quinn.

“For virtually every customer out there, the cloud bill every month is what the provider tells them it is, because who's going to have the energy even to dissect it, let alone argue,” he said.

The complexity of understanding and controlling cloud costs is driven by what an organization looks like — how large it is, how it interfaces across functions — and its technology stack.

“The more elastic and cloudy and higher-level services that a cloud provider has that you use, the less predictable the spend becomes,” Quinn said. “Yes, it becomes a lot more efficient, but also a hell of a lot harder to predict.”

Quinn usually starts customer cloud cost conversations with a series of questions: Is their cloud bill accurate? Is there a bunch of stuff that should be turned off? Are there misconfigurations, in an engineering sense, where relatively small changes would have a dramatic impact? Do they have contract negotiations coming up?

“All the providers offer significant discounting for committed spend, but that commitment is measured in multiple years to a point where it's not just ‘predict your cloud bill next month,’ it's ‘predict what your spend is going to look like on what curve for the next half-a-decade,’” Quinn said. “That is an incredibly daunting task, because you're about to sign on the dotted line before you've figured out what that's going to look like.”
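The commitment math Quinn alludes to can be sketched with hypothetical numbers: a 30% discount on committed spend that is owed whether or not usage materializes, with overage billed at on-demand rates. This is an illustrative pricing model, not any provider's actual contract terms:

```python
def committed_vs_on_demand(actual_monthly, commit, discount=0.30):
    """Monthly bill under a spend commitment vs. pure on-demand.

    The committed amount is paid at the discounted rate no matter what;
    usage above the commitment spills over at the on-demand rate.
    Hypothetical model for illustration only.
    """
    committed_bill = commit * (1 - discount) + max(actual_monthly - commit, 0)
    return committed_bill, actual_monthly  # vs. the pure on-demand bill

# Commit to $100k/month expecting growth; usage instead falls to $60k.
committed, on_demand = committed_vs_on_demand(60_000, 100_000)
print(committed, on_demand)  # the commitment ($70k) now costs more than on-demand ($60k)
```

This is why a commitment is effectively a multi-year forecast: over-commit and the discount inverts into a penalty; under-commit and the overage erodes the savings.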

“One thing that's often a canard is that people think, ‘Ah, I'm going to move to the cloud, because it'll save me money,’” Quinn said. “I've done the deep-dive TCO [total cost of ownership] analysis, and I am left with the conclusion that if you're doing this route to the cloud from your data center to save money, you're almost certainly wrong.”

After all, cloud computing isn't just a rental server market; it's an entirely new way of building and maintaining applications.

“You will not realize an appreciable cost savings for at least five years — arguably ever, depending on what you do,” Quinn said. “So the reason to do it instead is because of a capability story — it lets you move faster as a company.”

