Do businesses really need real-time analytics? Data startups are counting on it.

Real-time database startups enable split-second analytics and machine learning for things like financial fraud prevention, dynamic pricing or product recommendations. But do everyday enterprises really need lightning-fast real-time data?

The term “real time” has been infused throughout tech, from real-time stock picks to real-time pizza tracking. But there’s near-real time, and then there’s real time.

As everyday enterprises begin incorporating the data tools and tactics used inside the biggest tech companies, a sector of data services providers has emerged to help them take advantage of truly real-time analytics and machine-learning approaches that, in the past, only giants with far larger database teams and resources could afford. Companies like Hazelcast, Rockset and Tecton enable split-second analytics and machine learning for things like financial fraud prevention, dynamic pricing or product recommendations that respond to what you just clicked.

These companies promise to leave plodding batch-data processing for old-school business intelligence analysis in the dust. But whether every enterprise needs, wants or is ready to operate at a clip as fast paced as a Citibank, Uber or Amazon remains to be seen.

Updating data every few days, every night or even every hour for business analysis using a typical batch-processing approach “is like playing Monday morning quarterback,” said Venkat Venkataramani, CEO and co-founder of Rockset, which provides a database for building real-time data, analytics and query applications. “That is not going to be good enough anymore. I’m six points down, the game is not over yet — what do I do differently to change the outcome of the game?” he said, extending a football metaphor he believes will describe more and more business scenarios involving fresh data over the next two to three years.

These startups believe the increasing influx of real-time data flooding into data lakes and lakehouses — from ecommerce site clicks to IoT sensor pings — will compel businesses to use that information immediately as it flows in.

Some of Hazelcast’s customers process the data they consume in real time for machine learning-based predictive analytics used to maintain equipment on oil rigs and wind turbines, said CEO Kelly Herrell. Tecton, which helps companies run real-time data pipelines to feed ML models, has seen insurance providers use its services to run driver-behavior-based discount programs, according to Mike Del Balso, the company’s co-founder.

But those are rare use cases. These startups say most of their inbound interest is coming from banking and ecommerce customers that want to prevent fraud while someone waits for a banking transaction to happen at an ATM, or to detect right away when a payment app stops working in a particular country. “Wherever there is real money and risk involved if you don’t manage something in real time” is where companies are using real-time data services, said Venkataramani.

“The top use case for us is recommendations,” said Del Balso, who said customers “want to make a recommendation based on what the user just did.” Anyone who has browsed products on an ecommerce site or scrolled through movie options on a content streaming platform knows the micro-frustrations resulting from systems that don’t recognize their most recent moves.

Machine learning helps spark real-time interest

Across the business spectrum, there are a few key factors fueling the rise of real-time data processing and analytics, and the rise of AI and machine learning is an important one. Companies want to use machine-learning systems that improve as they’re exposed to fresh information in the hopes of making smarter decisions and optimizing existing efforts in milliseconds.

While traditional business intelligence analytics efforts don’t need real-time data or processing, “real time and machine learning really go hand-in-hand,” said Gaetan Castelein, vice president of marketing at Tecton, who explained that real-time data and machine-learning trends are converging, feeding off one another.

Consider a bank conducting millions of transactions each hour. Whether a fraud-detection model approves each transaction can depend on as many as 2,000 individual signals: some relatively static, such as a ZIP code, and some brand new, such as the amount of a cash transfer. Data systems like Tecton’s optimize that process by separating the signals that must be computed fresh from the ones that rarely change.

“Because you need to respond to the transaction really quickly, but you don’t want to be computing that stuff in real time,” Del Balso said. “It’s OK if some of the signals are a bit delayed,” he said, adding, “That becomes a performance tradeoff.”
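
To make that tradeoff concrete, here is a minimal sketch of how a scoring service might combine the two kinds of signals. It is illustrative only, not Tecton’s actual API; the feature names, the precomputed store and the toy model are all hypothetical.

    # Hypothetical sketch: score a transaction by combining slow-changing,
    # precomputed features with a few features computed at request time.
    from datetime import datetime, timezone

    # Stand-in for a feature store refreshed by an earlier batch or
    # streaming job. Slightly stale values are acceptable here.
    PRECOMPUTED = {
        "customer_123": {"zip_code": "94107", "avg_transfer_90d": 240.0},
    }

    def fresh_features(txn):
        # Signals that must reflect the event itself, computed per request.
        return {
            "amount": txn["amount"],
            "hour_of_day": datetime.now(timezone.utc).hour,
        }

    def score(txn, model):
        # Cheap lookup for the precomputed signals...
        features = dict(PRECOMPUTED[txn["customer_id"]])
        # ...plus a small amount of real-time computation for the rest.
        features.update(fresh_features(txn))
        return model.predict(features)

    class ToyModel:
        # Stand-in for a trained fraud model.
        def predict(self, features):
            if features["amount"] > 10 * features["avg_transfer_90d"]:
                return "review"
            return "approve"

    print(score({"customer_id": "customer_123", "amount": 5000.0}, ToyModel()))

The point of the split is that only fresh_features runs on the request path; everything else is a lookup, which is what keeps the response in milliseconds.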

Optimizing a recommendation engine to respond in real time to something a user just did a half-second ago “is the difference between Netflix and TikTok,” said Manish Devgan, chief product officer at Hazelcast. “As you’re browsing, it’s actually updating a machine-learning model,” he said of TikTok’s content recommendation system.

In conjunction with the machine-learning boom, innovations in data infrastructure have also helped propel interest in using data in real time. The availability and ease of use of event-streaming technologies such as Apache Kafka, and of managed platforms built on it like Confluent, are helping companies run real-time data initiatives with smaller engineering teams.
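
As a rough illustration of how little code a streaming consumer now takes, the sketch below reads a stream of click events with the open-source kafka-python client. The topic name, broker address and event fields are assumptions made for the example, not details from any company mentioned here.

    # Minimal sketch, assuming a local Kafka broker and a "clicks" topic
    # carrying JSON click events. Requires: pip install kafka-python
    import json
    from kafka import KafkaConsumer

    consumer = KafkaConsumer(
        "clicks",                            # hypothetical topic name
        bootstrap_servers="localhost:9092",  # hypothetical broker address
        auto_offset_reset="latest",          # react only to new events
        value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
    )

    # Each event is handled moments after it is produced, rather than
    # sitting in storage until a nightly batch job picks it up.
    for message in consumer:
        event = message.value
        print(f"user {event['user_id']} clicked item {event['item_id']}")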

More broadly, the explosion of data streaming in from online and IoT systems, coupled with adoption of the cloud and its cost efficiencies, is also sparking interest in using data in real time.

“As more enterprises continue to migrate to the cloud and invest in digital transformations, the volume, variety and velocity of machine-generated data — clickstream, logs, metrics, IoT — will proliferate exponentially,” said Derek Zanutto, general partner at CapitalG, who added that the Google investment arm does not currently have any portfolio companies in the real-time data processing space.

“As the volume of machine-generated data continues to proliferate, forward-thinking, data-driven organizations will increasingly seek out opportunities to mine this data for real-time operational analytics use cases that help them maintain or improve their market leadership,” Zanutto said.

When near-real time is good enough

There’s a difference between what’s real time and what’s merely in the ballpark, according to database experts. “People use the word ‘real time’ in a very abusive manner,” said Ravi Mayuram, chief technology officer at Couchbase, a database company that enables real-time data processing.

He and others say if data analysis happens in minutes rather than seconds or split seconds, it’s not real time; it’s just near-real time. Venkataramani said he defines real-time data processing as something that takes less than two seconds.

“This is kind of complicated and nuanced, so it’s easy to mix together real time and near-real time,” Del Balso said, adding that for most companies and use cases, near-real time is good enough.

Indeed, some experts say processing data every few minutes should suffice for many businesses.

“There are extreme ends of this where you really, really need [real time],” said Ryan Blue, co-founder and CEO of data platform startup Tabular and a former Netflix database engineer who helped build Apache Iceberg, an open table format used for lakehouse-style analytics. “The question is, when is a five-minute batch process sufficient?” Blue said.

Some Rockset customers don’t even use the company’s most extreme real-time data capabilities. Seesaw, an online learning platform, uses Rockset to enable analytics, data visualizations and data queries. But for now, said Emily Voigtlander, Seesaw’s product manager, batch processing every night is just fine. While she did not rule out future needs for Rockset’s real-time data services, Voigtlander said, “It’s not actually what is most essential to our business right now.”

But just wait, some say. Today, companies that are still getting a handle on batch processing might decide to leapfrog the competition, said Preeti Rathi, general partner at venture capital firm Icon Ventures. Those kinds of companies might ask, “If we can directly just jump here, why not?” she said.

The growing interest in real-time analytics and data processing represents what Gerrit Kazmaier, Google Cloud’s vice president and general manager for Database, Data Analytics and Looker, called a “paradigm shift” away from traditional data stacks toward systems that “connect the systems of intelligence” to applications, letting companies influence customer behavior or take action on the spot using machine learning and analytics.

“So now, you come to a tipping point, where suddenly the strategic platform of the enterprise is not anymore the functional system, it’s the data system,” he said.

