How Google Cloud plans to kill its ‘Killed By Google’ reputation

Under the new Google Enterprise APIs policy, the company is making a promise that its services will remain available and stable far into the future.

Thomas Kurian, CEO of Google Cloud, speaks at Google Cloud Next '19 in San Francisco.

Google Cloud CEO Thomas Kurian has promised to make the company more customer-friendly.

Photo: Michael Short/Bloomberg via Getty Images 2019

Google Cloud issued a promise Monday to current and potential customers that it's safe to build a business around its core technologies, another step in its transformation from an engineering playground to a true enterprise tech vendor.

Starting Monday, Google will designate a subset of APIs across the company as Google Enterprise APIs, including APIs from Google Cloud, Google Workspace and Google Maps. APIs selected for this category — which will include "a majority" of Google Cloud APIs according to Kripa Krishnan, vice president at Google Cloud — will be subject to strict guidelines regarding any changes that could affect customer software built around those APIs.

"It is built on the principle that no feature may be removed or changed in a way that is backwards incompatible for as long as customers are actively using it," Krishnan said. "If a deprecation or breaking change of an API is unavoidable, then we are saying that the burden is on us to make the experience as effortless and painless as possible to our customers."

The announcement is clear recognition of widespread feedback from Google Cloud customers and outright derision in several corners of the internet regarding Google's historic reputation for ending support for its APIs without sufficient notice or foresight. The canonical example was probably the company's decision to shutter Google Reader in 2013 with just a couple of months' notice, which led to a torrent of criticism that persists today.

But while it's one thing to discontinue free consumer-facing services like Reader that Google thinks aren't used widely enough to justify ongoing support, it's quite another to adopt that stance with paying business customers. Even if they're one of only a few customers using a particular service, cloud customers need to know that service will be available and stable far into the future.

"We're striving to leave no dead ends in our products and leave no customer behind, even if this adds significant costs to us," Krishnan said.

Chopping block

When asked if she was familiar with the "Killed By Google" website and Twitter account, run by Cody Ogden as a satirical take on Google's reputation for stability, Krishnan couldn't help but laugh.

"It was pretty apparent to us from many sources on the internet that we were not doing well," she allowed.

Over the last several years, Google Cloud has been trying to shed a well-earned reputation as an engineering-driven organization that considered itself the foremost authority on web-scale infrastructure computing, regardless of what its customers actually wanted to do with its tools. That mindset — bordering on arrogance — really stood out against competitors like AWS, which won the trust of developers and CIOs with its early commitment to cloud customers, and Microsoft, which has nurtured business relationships with nearly every company on the planet over the last several decades.

This mentality began to change in early 2019 after CEO Thomas Kurian was brought in from Oracle to teach Google Cloud how to be an enterprise tech vendor. Kurian hired legions of enterprise salespeople to develop closer relationships with cloud buyers, and also began to steer Google Cloud's product-development culture into a more humble posture.

"Pride is a trap for the unwary, and it has ensnared many a Google team into thinking that their decisions are always right, and that correctness (by some vague fuzzy definition) is more important than customer focus," wrote Steve Yegge, a former software engineer at both Google and Amazon, in an epic post last August excoriating Google's approach to supporting its tools.

Google Cloud has heard that feedback loud and clear, Krishnan said.

"It was not that we didn't have [a deprecation] policy before, it just didn't work for us at scale. It worked much better when you were small, and you have contained customer units or users that you interact with daily," she said. "It absolutely did not work at the scale of cloud, so we had to rethink it."

Under the new Google Enterprise API policy, the company is promising that it won't kill or alter APIs that are being "actively used" by its customers, although it's not exactly clear how "active use" is defined. Should Google decide it needs to deprecate or make a change that will force customers to make substantial alterations to their own software, it will give at least one year's notice of the impending change.

Safe for business

The new program should remove some objections that cloud buyers might have had about Google, but the frequency with which Google changes its APIs under this program will be scrutinized against similar decisions at AWS and Microsoft. Industry watchers believe the two leading cloud providers have made far fewer changes to their services over the past several years than Google has.

Cloud infrastructure computing is in the late-majority phase of the adoption cycle, and the companies that frantically purchased cloud services amid the pandemic last year tend to be more risk-averse than cloud early adopters. The new API policy also gives current Google Cloud customers more assurance that they won't have to repeat all the work it took to move to the cloud if, a few years down the road, Google decides it no longer wants to support a service that is critically important to their business.

"These tenets are a much deeper construct that really strikes at the root of how we do work in Google Cloud," Krishnan said. "It's really a shift in the mindset of the organization as we pivot more and more towards doing right by our customers."

More details on the Google Enterprise API policy are available here.

Correction: An earlier version of this story misspelled Cody Ogden's name. This story was updated on July 26, 2021.


The West’s drought could bring about a data center reckoning

When it comes to water use, data centers are the tech industry’s secret water hogs — and they could soon come under increased scrutiny.

Lake Mead, North America's largest artificial reservoir, has dropped to about 1,052 feet above sea level, the lowest it's been since being filled in 1937.

Photo: Mario Tama/Getty Images

The West is parched, and getting more so by the day. Lake Mead — the country’s largest reservoir — is nearing “dead pool” levels, meaning it may soon be too low to flow downstream. The entirety of the Four Corners plus California is mired in megadrought.

Amid this desiccation, hundreds of the country’s data centers use vast amounts of water to hum along. Dozens cluster around major metro centers, including those with mandatory or voluntary water restrictions in place to curtail residential and agricultural use.

Lisa Martine Jenkins

Lisa Martine Jenkins is a senior reporter at Protocol covering climate. Lisa previously wrote for Morning Consult, Chemical Watch and the Associated Press. Lisa is currently based in Brooklyn, and is originally from the Bay Area. Find her on Twitter (@l_m_j_) or reach out via email (ljenkins@protocol.com).

Every day, millions of us press the “order” button on our favorite coffee store's mobile application: Our chosen brew will be on the counter when we arrive. It’s a personalized, seamless experience that we have all come to expect. What we don’t know is what’s happening behind the scenes. The mobile application is sourcing data from a database that stores information about each customer and what their favorite coffee drinks are. It is also leveraging event-streaming data in real time to ensure the ingredients for your personal coffee are in supply at your local store.

Applications like this power our daily lives, and if they can’t access massive amounts of data stored in a database as well as stream data “in motion” instantaneously, you — and millions of customers — won’t have these in-the-moment experiences.

Jennifer Goforth Gregory
Jennifer Goforth Gregory has worked in the B2B technology industry for over 20 years. As a freelance writer she writes for top technology brands, including IBM, HPE, Adobe, AT&T, Verizon, Epson, Oracle, Intel and Square. She specializes in a wide range of technology, such as AI, IoT, cloud, cybersecurity and CX. Jennifer also wrote a bestselling book, The Freelance Content Marketing Writer, to help other writers launch a high-earning freelance business.

Indeed is hiring 4,000 workers despite industry layoffs

Indeed’s new CPO, Priscilla Koranteng, spoke to Protocol about her first 100 days in the role and the changing nature of HR.

"[Y]ou are serving the people. And everything that's happening around us in the world is … impacting their professional lives."

Image: Protocol

Priscilla Koranteng's plans are ambitious. Koranteng, who was appointed chief people officer of Indeed in June, has already enhanced the company’s abortion travel policies and reinforced its goal to hire 4,000 people in 2022.

She joined the HR tech company at a time when many other tech companies are enacting layoffs and cutbacks, but said she sees this precarious time as an opportunity for growth companies to really get ahead. Koranteng, who comes from an HR and diversity VP role at Kellogg, is working on embedding her hybrid set of expertise in her new role at Indeed.

Amber Burton

Amber Burton (@amberbburton) is a reporter at Protocol. Previously, she covered personal finance and diversity in business at The Wall Street Journal. She earned an M.S. in Strategic Communications from Columbia University and B.A. in English and Journalism from Wake Forest University. She lives in North Carolina.


New Jersey could become an ocean energy hub

A first-in-the-nation bill would support wave and tidal energy as a way to meet the Garden State's climate goals.

Technological challenges mean wave and tidal power remain generally more expensive than their other renewable counterparts. But government support could help spur more innovation that brings down cost.

Photo: Jeremy Bishop via Unsplash

Move over, solar and wind. There’s a new kid on the renewable energy block: waves and tides.

Harnessing the ocean’s power is still in its early stages, but the industry is poised for a big legislative boost, with the potential for real investment down the line.

Lisa Martine Jenkins


Watch 'Stranger Things,' play Neon White and more weekend recs

Don’t know what to do this weekend? We’ve got you covered.

Here are our picks for your long weekend.

Image: Annapurna Interactive; Wizards of the Coast; Netflix

Kick off your long weekend with an extra-long two-part "Stranger Things" finale; a deep dive into deckbuilding games like Magic: The Gathering; and Neon White, which mashes up several genres, including a dating sim.

Nick Statt

Nick Statt is Protocol's video game reporter. Prior to joining Protocol, he was news editor at The Verge covering the gaming industry, mobile apps and antitrust out of San Francisco, in addition to managing coverage of Silicon Valley tech giants and startups. He now resides in Rochester, New York, home of the garbage plate and, completely coincidentally, the World Video Game Hall of Fame. He can be reached at nstatt@protocol.com.
