Upended by the cloud: How 50-year-old data giant Acxiom learned to accept change

For 53-year-old data services giant Acxiom, the question is not just how to move its data and its customers’ data to the cloud. It’s how to do it without jeopardizing its raison d’être.

The effect of Acxiom's cloud migration was more than cultural. It forced an upheaval of how the company did business.

Illustration: Christopher T. Fong/Protocol

There was a barista station with a cappuccino robot, and guests were treated to a Michelin-caliber spread for breakfast and lunch. But despite the luxe accommodations, there was something nagging visitors from Acxiom when Google Cloud execs hosted people from the data services giant at an all-day briefing at Google HQ in 2019.

As much as Acxiom’s cloud converts believed in its promise of speed, efficiency and connectivity, they knew that moving to the cloud could cost far more than the price of data transfer and compute. It could mean cannibalizing the very business the Arkansas company – famous for its 160,000-square-foot, tornado-proof data center – had been in for much of its 53 years.

Acxiom’s data center services once brought in a significant portion of its revenue, but costs grew as the AI rush required more data storage and speedier processing to enable machine learning and other analytics. Salespeople and others were reluctant to welcome Silicon Valley cloud partnerships that could continue to eat away at the good ol’ days of data center revenues and steak dinners in Little Rock with clients.

Plus, Acxiom’s cloud-faithful would have to convince the rest of the company that a more deliberate move to the cloud was the right one.

“There’s still a lot of old-timers who I think definitely feel that way: that they’re slitting their own throat if they go all-in on cloud,” said Mark Donatelli, managing partner at marketing data technology consultancy Cimply. Donatelli worked on Acxiom’s data services product team from 2009 to 2013, and attended that 2019 Google meeting while working for Acxiom’s parent company, ad agency conglomerate Interpublic Group.

Today Acxiom’s corporate blog posts tout a “cloud-first” approach, but neither Google nor its competitors have an exclusive on that business. The company has relationships with AWS, Google Cloud and Snowflake, as well as other cloud services, including Databricks and MongoDB.

“There’s still a lot of old-timers who [think] they’re slitting their own throat if they go all-in on cloud.” —Mark Donatelli, managing partner at Cimply

Though corporate brand-speak may imply it’s simply a matter of flipping a “cloud-first” switch, figuring out the logistics and business rationale for each piecemeal transition to the cloud is something quite different.

For Acxiom, which is essentially a private cloud company, the question is not just how to move its clients’ customer data and its own proprietary data products and services from inside data center servers to big cloud platforms. It’s how to do that without jeopardizing Acxiom’s decades-long raison d’être.

A piecemeal transition

By the time its execs were invited out for Google’s dog and pony show, Acxiom had already spent a few years evaluating and testing how to implement a gradual shift to the cloud. When Bhavna Godhania, senior director of Strategic Partnerships at Acxiom, started with the company in 2017, she said, “There was already a discovery and analysis core team that would secretly meet, and they had everyone from industry analysts to third-party consulting companies, to technology partners who had previously embarked on their cloud journeys earlier.”

Cloud believers inside the company had been thinking for even longer about how they would need to transform existing data workflows and rebuild systems to ensure they worked when customers used data and systems in the cloud. They all knew migration would be painful: rebuilding data pipelines, upending contracts and going through stringent compliance certifications all over again.

It’s no wonder a lot of Acxiom’s clients have yet to move their proprietary data about their customers, such as phone numbers, emails, purchase history, credit card transactions and website clicks, to the cloud. “The vast majority of clients, across the board — they still run on-prem,” said Eugene Becker, general manager of Data and Identity at Acxiom, referring to customers who store their data and applications in an Acxiom data center, their own data center or a hybrid of the two.

Acxiom must move at the pace of its customers, but it has more control over making its own packaged data and services available on cloud platforms, where they can connect more readily with the marketing and customer management cloud applications clients use. The consumer data Acxiom sells to marketers – showing the types of cars people drive, their household income levels or whether they travel or buy antiques – is available in AWS’s and Snowflake’s cloud data marketplaces.

Clients can already use the cloud to access the brand-specific private identity graphs Acxiom assembles to connect personally identifiable customer data to other information about them. Those systems are separate from the identity services provided by LiveRamp, which was once an Acxiom subsidiary but split from the company in 2018 and now still partners with it.
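The mechanics of that kind of identity resolution can be illustrated with a toy example. The sketch below is purely hypothetical – it is not Acxiom's system, and real identity graphs use far more signals and fuzzier matching – but it shows the core idea: customer records that share an identifier, such as an email or phone number, get clustered to the same person, here using a simple union-find structure.

```python
# Toy identity resolution: cluster records sharing an email or phone.
# Illustrative only; not Acxiom's actual identity graph.

class UnionFind:
    def __init__(self):
        self.parent = {}

    def find(self, x):
        self.parent.setdefault(x, x)
        while self.parent[x] != x:
            self.parent[x] = self.parent[self.parent[x]]  # path halving
            x = self.parent[x]
        return x

    def union(self, a, b):
        self.parent[self.find(a)] = self.find(b)

def resolve_identities(records):
    """Group customer records that share any identifier value."""
    uf = UnionFind()
    seen = {}  # identifier value -> first record id that carried it
    for rec in records:
        for key in ("email", "phone"):
            value = rec.get(key)
            if value is None:
                continue
            if value in seen:
                uf.union(rec["id"], seen[value])
            else:
                seen[value] = rec["id"]
    clusters = {}
    for rec in records:
        clusters.setdefault(uf.find(rec["id"]), []).append(rec["id"])
    return list(clusters.values())

records = [
    {"id": "r1", "email": "ana@example.com", "phone": "555-0100"},
    {"id": "r2", "email": "ana@example.com"},   # same email as r1
    {"id": "r3", "phone": "555-0100"},          # same phone as r1
    {"id": "r4", "email": "bob@example.com"},   # unrelated person
]
print(resolve_identities(records))  # r1, r2 and r3 cluster; r4 stands alone
```

The hard part in production is not the clustering but deciding which matches to trust, which is why rebuilding such systems for the cloud is slow, careful work.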

Other Acxiom services are in the process of being moved to the cloud. Right now, Acxiom’s U.S. Postal Service-certified data hygiene service is getting set up on AWS, for example. Customers use the system to clean inaccurate contact information to help ensure that physical mail, email and targeted personalized ads go to the right place. In the future, that service could be available on other clouds outside AWS.
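What "cleaning inaccurate contact information" involves can be sketched in a few lines. This is a drastically simplified, hypothetical illustration – the real USPS-certified process validates addresses against the official postal database rather than applying string rules – but it shows the flavor of normalization such a service performs:

```python
# Hypothetical address normalizer; real USPS CASS processing validates
# against the official address database instead of using string rules.

import re

# A few common street-suffix abbreviations (USPS-style)
SUFFIXES = {"STREET": "ST", "AVENUE": "AVE", "ROAD": "RD", "BOULEVARD": "BLVD"}

def clean_address(raw: str) -> str:
    """Collapse whitespace, strip punctuation, standardize suffixes."""
    addr = re.sub(r"\s+", " ", raw.strip()).upper()
    addr = addr.replace(".", "").replace(",", "")
    words = [SUFFIXES.get(w, w) for w in addr.split(" ")]
    return " ".join(words)

print(clean_address("601  E. Third   street"))  # "601 E THIRD ST"
```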

“There may be a point where we look at economics and maybe have a hybrid cloud architecture where it's more cost-effective to leave it in one cloud and access it from another cloud,” said Kyle Hollaway, senior vice president of Global Identity at Acxiom.

“[If] you're going to build something from scratch, building to the cloud makes a lot of sense.” —Eugene Becker, general manager of Data and Identity at Acxiom

Inside Acxiom, people talk about how they can’t just “lift and shift.” Today, portions of Acxiom’s identity infrastructure run out of its data centers, while some run in AWS, Becker said. “If you're going to move your data to the cloud, you have to rebuild the solutions,” he said, adding that eventually all of Acxiom’s identity infrastructure will be accessible in AWS.

“It requires a bunch of work, and the advantages are kind of strategic, but if you start to look at the business case, it's not overwhelming in terms of cost savings,” he continued, adding, “[If] you're going to build something from scratch, building to the cloud makes a lot of sense.”

But deciding whether to rebuild an existing application requires a different set of criteria. “Now, to further complicate that, suppose you have a bunch of hardware in a data center that's already been paid for. [That] hardware, for some period of time until it breaks down, could be more efficient than the cloud,” Becker said. “You really have to do the math on the migration to figure out how and when you’re going to end up making money, and the calculus really varies.”
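Becker's calculus can be sketched with entirely made-up numbers. Because already-paid-for hardware only costs power and upkeep until it ages out, a migration's one-time cost can take years to recover, and the break-even point shifts with every input:

```python
# Back-of-the-envelope migration math with hypothetical figures;
# not Acxiom's actual costs.

def breakeven_month(onprem_monthly, cloud_monthly, migration_cost):
    """Months until cumulative cloud savings cover the migration cost."""
    savings = onprem_monthly - cloud_monthly
    if savings <= 0:
        return None  # at these rates, the cloud never pays back
    months = 0
    recovered = 0.0
    while recovered < migration_cost:
        months += 1
        recovered += savings
    return months

# Say sunk-cost hardware costs $40k/month to keep running, the cloud
# equivalent runs $25k/month, and the rebuild/recertification costs $600k.
print(breakeven_month(40_000, 25_000, 600_000))  # 40 months
```

With those assumed figures, the move pays for itself only after more than three years, which is why, as Becker says, "the calculus really varies."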

Data security and privacy are major deciding factors for Acxiom, too. As a legacy data broker scrutinized over the years for having too much control over too much private information about too many people, the company has meticulously refined its data security, governance and cross-border data transfer flows inside its own systems. “We do it so well on-prem — it’s unbeatable. Can we say the same when we do that in the cloud, and how much testing is enough when it comes to data and data protection?” Godhania said.

Ultimately, as much as the company might trust its cloud partners, the buck stops with Acxiom if data flowing through its applications in the cloud is compromised, she said. “We ask ourselves, what are the benefits? We get speed, we get flexibility, maybe cost down the road, but then weighted against the potential harms for consumers, what could happen if the data was subjected to unauthorized access in these clouds?” Godhania said.

Capital One: It’s ‘showtime’

Some Acxiom customers are particularly exacting about how their data is managed and where it lives. Some stipulate in their contracts that their data must be stored in servers inside the company’s Conway, Arkansas data center as opposed to one of its other locations. Some customers even send inspectors to assess the physical equipment and security apparatuses, something they typically can’t do when it comes to cloud providers’ physical systems.

“They could send somebody down. They can touch the walls. They can crawl under the floor if they wanted to, touch the ceiling. They knew the constraints, saw the cameras — the cloud didn't provide that,” said a former Acxiom data technology exec who was with the company for nearly 26 years until 2021, and asked not to be named.

In general, certain types of customers have adopted the cloud before others. “Automotive, retailers, travel and entertainment — those are verticals who have adopted cloud-based solutions faster than more regulated industry verticals like financial services, insurance, health care, things like that,” said Chad Engelgau, Acxiom’s CEO. Engelgau’s been at Acxiom and IPG-owned marketing and data companies IPG Mediabrands and Kinesso since 2006.

“That journey had been going on for a while, and then once ‘Cap One’ migrated, that really was like, ‘OK, showtime.’” —Bhavna Godhania, senior director of Strategic Partnerships at Acxiom

Capital One is an outlier among Acxiom’s financial services customers, though, and insiders say the bank had an important influence on Acxiom’s cloud-related efforts. Its move to the cloud – completed in 2020, when the bank finished its migration to AWS and Snowflake – kicked Acxiom’s own cloud efforts into higher gear, in part because Capital One was a large customer with lots of moving parts and strict security demands.

“That journey had been going on for a while, and then once ‘Cap One’ migrated, that really was like, ‘OK, showtime,’” Godhania said.

“It got serious,” said Donatelli, who said that before Capital One decided to go to the cloud, “everybody was talking about it, but no one was doing it in earnest.”

Capital One declined to comment for this story.

‘Down-home' times dwindle for Acxiom sales

When Acxiom bought digital identity company LiveRamp in 2014, it was a visible sign that the old-school data broker was breaking away from its Arkansas roots, excited by the lure of Silicon Valley tech. Acxiom’s longtime chairman and CEO Charles Morgan helped grow the company’s size and footprint to 7,000 employees in nine countries, managing 6 petabytes of data storage by the time he resigned in 2007. But since leaving, Morgan has lamented what he saw as a departure from Acxiom’s traditional culture and customer-centric roots.

"I feel like Acxiom has moved with their CEO and executive team mostly to California now. They have mostly abandoned us here, which I think was a mistake," Morgan told the Northwest Arkansas Democrat-Gazette in 2017, remarking on a decision by then-CEO Scott Howe to sell Acxiom’s office building in Little Rock. Howe is now CEO of LiveRamp.

A member of the Arkansas Business Hall of Fame, Morgan reminisced in the Democrat-Gazette article about the meaning of the building to the company and its customers. "It helped us grow our relationships with these guys," he said. "We didn't have to drive them up to Conway. We could just take them off the airplane and zip them over to this nice briefing center. And then, of course, we did things like take them to Doe's for steaks and a good, down-home time,” he said. He was talking about Doe's Eat Place, a legendary Arkansas restaurant where diners share 2- and 3-pound T-bones and porterhouse steaks dished up family-style – a far cry from a cappuccino robot.

An Acxiom data center in an undisclosed location. Photo: Acxiom

Technology built for the cloud encroached on the old way of doing things, and its effect on Acxiom was more than cultural. It forced an upheaval of how the company did business.

Some salespeople perceived the cloud as a threat to their livelihood. Accustomed to selling three-year, flat-fee package deals of hardware, software and data management services, their standard contracts weren’t set up for the cloud. Acxiom’s data center management accounted for a substantial portion of contracts, and as customers fed more data into machine learning and business analytics tools, the costs grew.

As the company struggled to compete with cloud platforms designed for those purposes – sometimes at a lower cost – Acxiom would have to rationalize the value of its data infrastructure to clients. And when customers would insist Acxiom merely serve as a middleman between themselves and a chosen cloud provider, it could shrink Acxiom’s revenue by hundreds of thousands of dollars or more each year.

“People would [ask], ‘What's this going to do for my commission? What's this going to do for my bonus? What's this going to do for my career?’” said the former Acxiom exec. He said he eventually left the company in part because it wasn’t moving fast enough to the cloud. “There was a lot of resistance.”

Acxiom has revised arrangements with clients as contracts are renewed in the cloud age. “We do believe when solutions are effectively architected, and the availability of all the necessary components exists within a cloud, we can architect solutions that can lower customer costs, while creating greater flexibility and access to compute and storage, as well as access to a lot of really cool technology that is becoming native in the cloud,” said Engelgau.

Google gets a piece

As the cloud disruption struck fear in the hearts of salespeople back in Arkansas and beyond, Google Cloud put on the charm at its all-day 2019 schmooze-fest for Acxiom. Execs sipped coffee topped with Acxiom logo-adorned foam and listened as Google’s data, engineering and analytics leaders pitched the company’s cloud capabilities and product plans.

“They give you the dog and pony. All these different groups discussed all these capabilities of these tools, the road maps of all these products that were all very much aligned with the kinds of things that Acxiom needs in order to build these databases and to distribute data,” Donatelli recalled about the Google meeting.

Eventually, Google did get an Acxiom win. The company is the primary cloud partner for Acxiom’s Intelligence Hub. Launched in 2021, it’s a home base for ad- and marketing-related applications built by Acxiom data scientists and used by marketing customers for purposes like optimizing ad targeting or visualizing data.

Google, which attracts a significant portion of the world’s digital advertising dollars and controls many of the data pipes advertisers rely on, made sense as the cloud partner for the applications hub, said Shea Heath, Acxiom’s general manager of Solutions and Analytics. “Access and speed of information — with everything being within Google — you can do more things in real-time,” Heath said. “[Google tends] to protect their data. So now we can work in their environment.”

It’s not likely that any single cloud provider will ever get an exclusive deal with Acxiom. But because Acxiom is a marketing-focused data company, Google’s dominance in the ad industry could lead to more tie-ins between the two.

Setting up Acxiom’s services and applications to run on Google’s infrastructure and integrate with its marketing platform to enable data connections needed to target and measure ads delivered by Google, "is fundamentally where the market is going. And therefore, Acxiom absolutely will be there, and is there for a number of use cases today,” Engelgau said.

But whether Acxiom will someday ditch its data centers entirely as big cloud providers like AWS and Google grab more of its data management business is anyone’s guess, even Engelgau’s. “Truthfully, we will probably have our own data centers, and will continue to operate and run client solutions in those for five, eight, 10 years, probably – maybe forever – not sure,” Engelgau said.

These days though, few Acxiom employees need to be, or are even allowed, inside the hulking structures. As Hollaway put it, “I do not physically have to go into the data center. I drive by and wave.”

