
Google Cloud just built a data lakehouse on BigQuery

BigLake, a new data lake storage engine that resembles data lakehouses built by newer data companies, will be at the center of Google Cloud’s data platform strategy.

Sudhir Hasbe, Google Cloud’s senior director of Product Management for data analytics. Photo: Google Cloud

Google Cloud plans to launch a new data lake storage engine based on its popular BigQuery data warehouse to help remove barriers preventing customers from mining the full value of their ever-increasing data.

BigLake, now available in preview, allows enterprises to unify their data warehouses and data lakes to analyze data without worrying about the underlying storage format or systems, according to Sudhir Hasbe, Google Cloud’s senior director of Product Management for data analytics.

“The biggest advantage is then you don't have to duplicate your data across two different environments and create data silos,” Hasbe said in a press briefing prior to Wednesday’s Google Data Cloud Summit, where BigLake is being announced.

With BigLake, Google Cloud is extending the capabilities of its 11-year-old BigQuery to data lakes on Google Cloud Storage to enable a flexible, open lakehouse architecture, according to the cloud provider. A data lakehouse is an open data-management architecture that combines data-warehouse-like data management and optimization functions, including business intelligence, machine learning and governance, for data lakes that typically provide more cost-effective storage.
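
As a rough sketch of what that can look like in practice, the snippet below uses the google-cloud-bigquery Python client to define an external table over Parquet files that stay in a Cloud Storage bucket. The project, dataset, bucket and resource-connection names are hypothetical placeholders, not anything Google has published.

```python
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # hypothetical project ID

# Define a table in BigQuery over Parquet files that remain in the data lake
# (Cloud Storage); the resource connection "gcs_connection" is assumed to exist.
ddl = """
CREATE EXTERNAL TABLE `my-project.lake_demo.events`
WITH CONNECTION `my-project.us.gcs_connection`
OPTIONS (
  format = 'PARQUET',
  uris = ['gs://my-data-lake-bucket/events/*.parquet']
)
"""
client.query(ddl).result()  # wait for the DDL job to complete
print("External table defined over the data lake files.")
```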

BigQuery is a Google Cloud-managed, serverless, multicloud data warehouse that lets customers run analytics over vast amounts of data in near real time. It processes more than 110 terabytes of customers’ data every second on average, according to Google Cloud.
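
A query against that kind of table is ordinary SQL submitted to BigQuery’s serverless engine. Here is a minimal sketch with the same Python client, again using hypothetical table and column names:

```python
from google.cloud import bigquery

client = bigquery.Client()  # no clusters to provision or size

query = """
SELECT event_type, COUNT(*) AS events
FROM `my-project.lake_demo.events`
WHERE event_ts >= TIMESTAMP_SUB(CURRENT_TIMESTAMP(), INTERVAL 1 HOUR)
GROUP BY event_type
ORDER BY events DESC
"""

# The same SQL works whether the table is native BigQuery storage or an
# external table over Cloud Storage files.
for row in client.query(query).result():
    print(f"{row.event_type}: {row.events}")
```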

“We have tens of thousands of customers on it, and we invested a lot in all the governance, security and all the core capabilities, so we're taking that innovation from BigQuery and now extending it onto all the data that sits in different formats as well as in lake environments — whether it's on Google Cloud with Google Cloud Storage, whether it's on AWS or whether it's on [Microsoft] Azure,” Hasbe said.

BigLake will be at the center of Google Cloud’s data platform strategy. Image: Google Cloud

BigLake will be at the center of Google Cloud’s data platform strategy, and the cloud provider will ensure that all its tools and capabilities integrate with it, according to Hasbe.

“We are going to seamlessly integrate our data management and governance capability with Dataplex, so any data that goes into BigLake will be managed [and] governed in a consistent fashion,” he said. “All of our machine-learning and AI capabilities … will also work on BigLake, as well as all our analytics engines, whether it's BigQuery, whether it's Spark, whether it’s Dataflow.”
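
Hasbe’s point about other engines can be illustrated with the open-source spark-bigquery connector: the same hypothetical table defined above is read from PySpark without copying the data into a separate system. This sketch assumes the connector jar is already available on the Spark cluster.

```python
from pyspark.sql import SparkSession

# Minimal sketch; assumes the open-source spark-bigquery connector is already
# on the cluster's classpath, and reuses the hypothetical table from above.
spark = SparkSession.builder.appName("biglake-spark-demo").getOrCreate()

events = (
    spark.read.format("bigquery")
    .option("table", "my-project.lake_demo.events")
    .load()
)

# Run the aggregation in Spark against the same underlying lake data.
events.groupBy("event_type").count().show()
```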

Enterprise data sets are growing from terabytes to petabytes, and the variety of data is expanding, from structured and semi-structured data to unstructured and IoT data collected from connected devices such as sensors and wearables. That data typically is stored across systems with different capabilities, whether data warehouses for structured and semi-structured data or data lakes for other data types, creating so-called data silos that can limit access and increase costs and risks, particularly when the data must be moved.

BigLake will support open-source file formats and standards, including Apache Parquet and ORC, newer table formats such as Apache Iceberg, and open-source processing engines such as Apache Spark.

“When you think about limitless data, it is time that we end the artificial separation between managed warehouses and data lakes,” said Gerrit Kazmaier, Google Cloud’s vice president and general manager for database, data analytics and Looker. “Google is doing this in a unique way.”

