Why Google Cloud is providing security for AWS and Azure users too

“To just focus on Google Cloud, we wouldn't be serving our customers,” Google Cloud security chief Phil Venables told Protocol.

Google Cloud announced the newest addition to its menu of security offerings.

Photo: G/Unsplash

In August, Google Cloud pledged to invest $10 billion over five years in cybersecurity — a target that looks like it will be easily achieved, thanks to the $5.4 billion deal to acquire Mandiant and reported $500 million acquisition of Siemplify in the first few months of 2022 alone.

But the moves raise questions about Google Cloud’s main goal for its security operation. Does Google want to offer the most secure cloud platform in order to inspire more businesses to run on it, or to build a major enterprise cybersecurity products and services business that serves customers in whatever environment they’ve chosen?

According to the cloud provider’s chief information security officer, Phil Venables, Google doesn’t need to pick just one of those goals to focus on.

“To just focus on Google Cloud, we wouldn't be serving our customers. Our customers' reality is a hybrid, multicloud environment,” Venables said in an interview with Protocol. “But as part of serving them there, and working with them, they inevitably move more things to Google Cloud for all of the advantages that we have.”

On Tuesday, Google Cloud announced the newest addition to its menu of security offerings that are available to customers. The Assured Open Source Software service will curate secure open source software packages on behalf of customers.

Ahead of that announcement, Protocol spoke to Venables about open-source security, enterprise security concerns and the talent shortage.

This interview has been edited and condensed for clarity.

With the Assured Open Source Software service, I gather that this is about more than just securing customers that are running on Google Cloud?

It is a Google Cloud-delivered product. But we're not just going to do this for things that run on Google Cloud. It could be for any software that enterprises consume into their on-premises systems, or in fact, other clouds.

What we've done at Google for a long time is we don't automatically consume open-source software into our critical systems. We take this open-source software and then we do a whole series of tests, and we find and fix security vulnerabilities before those open-source packages are consumed into our software builds.

So as we saw more organizations, over the past year or so, become increasingly concerned about [the security of] open source, we came up with the idea that we should probably commercialize what we do for ourselves. And thus was born the Assured Open Source Software service.

Beyond offering services like this one, how is your security strategy accounting for the talent shortage in cybersecurity?

We recognize the big challenges customers have around cybersecurity skills, and the fact that we need to somehow create a lot more cybersecurity professionals. That's true — but we also need to spend a lot of time thinking about how we 10x the productivity of the cybersecurity professionals we've already got.

A big part of what we're doing with Chronicle and Siemplify and the Security Command Center and VirusTotal, and other things that are coming, is to arrange all those together so that when customers buy and use those services, they're 10x-ing the capability they've got without 10x-ing the number of cybersecurity people they've got. We're very focused on enabling customers to run their security more effectively with the resources they've got.

How would you summarize the security strategy for Google Cloud overall?

We think the fact that we've got this built-in security capability for Google Cloud, rather than something that's been bolted on after the fact, is one of our key strengths. Our whole approach to default security across the platform is important. Secondly, we're very focused on how we can bring all of these tools together to enable customers to manage all of their security — not just on Google Cloud. It helps customers across all of their environments.

This is driving a lot of the investments you see us doing with things like Chronicle, Siemplify, VirusTotal, BeyondCorp Enterprise. You can see how Mandiant, assuming that acquisition closes, will be a key part of that story about how we help customers manage all their security, not just their security on Google Cloud.

If your goal is to grow the use of Google Cloud, why provide security that enables customers to run elsewhere?

We recognize that while we have some customers that run everything on Google Cloud, there are lots of customers that still run on-premises, and run in multiple clouds. Modern businesses have been built up over many years, and have quite complex IT environments. For us to not recognize that reality, and not help them with it, I think, is not the greatest thing for the customers. So a lot of our security tooling is capable of ingesting content from on-premises environments and other clouds. We're very focused on the reality that our big customers have.

We think if we keep doing that, customers will be better off. And ultimately, they'll want to run things more on Google Cloud. But we're certainly going to support them everywhere.

So you think that working to serve customers wherever they are on security could be an entry point for them with Google Cloud?

I think that's right. I think when customers have the experience of not just the security products we provide, but the base level of security and capability of the platform, they see a lot of advantage in moving across to us. But to get going with that, we have to work with them where they are.

