Data centers aren’t dead. But they’ll never look the same again.
Major cloud providers have accepted a future where hybrid cloud strategies take center stage, which means on-premises infrastructure will continue to evolve. Here's how.
A funny thing happened along the way to the cloud-only future we were promised a decade ago: Turns out, for many applications, the old-fashioned way of doing business on the internet works just fine.
Cloud vendors and customers have reevaluated their infrastructure strategies around the hybrid cloud in recent years, and that means self-managed data centers are going to be with us for a long time to come. Even AWS, the pioneer of the cloud market, now offers customers a rack of physical servers designed around AWS services that they can use in their data centers, just a few years after scoffing at the notion that anyone would want to manage their own equipment.
When the biggest evangelists in cloud computing have accepted the idea that self-managed data centers have a role in the future of infrastructure tech, it's time to start thinking about how those data centers will evolve. "We think of hybrid infrastructure as including the cloud along with various other edge nodes, on-premises data centers being one of them," said no less a cloud authority than AWS CEO Andy Jassy during his re:Invent keynote Tuesday.
The appeal of hybrid cloud is pretty clear. It's not hard to imagine how startups that turned into massive companies on the back of cloud computing might take a page from Dropbox circa 2015, when it decided to build its own infrastructure and in the process saved almost $75 million in the years leading up to its IPO. As edge computing and 5G networks start to actually make an impact, smaller, widely distributed data centers could become the best way to serve customers. And for lots of companies, regulations around data handling, security concerns, and good old-fashioned latency issues make it necessary to locate their computing infrastructure as close as possible to their end users anyway. In short, the enterprise data center is alive and well.
Like anything in the enterprise tech market, self-managed data centers will have to keep up with the times, allowing for changes in server performance, energy efficiency and networking techniques to avoid falling too far behind the steady upgrades that deep-pocketed cloud providers will make to their own hardware over the next decade. And the cloud is more than just someone else's server: Cloud computing has given rise to an entirely new way of building software, and applications built around hybrid architectures will still need to incorporate some of those lessons.
But "the future of infrastructure is everywhere," said David Cappuccio, an analyst with Gartner. Large enterprise companies are increasingly realizing, he added, that "their compute is not going to be on-premises, it's not going to be in the cloud, it's not going to be at the edge, it's going to be everywhere. It really depends on what the business requirements are."
Don't do it all yourself
Many companies that want or need to manage their own servers are at least looking to get out of the building-maintenance business. Instead, they're seeking out data center providers like Equinix or Digital Realty that provide space, power, cooling systems and security to customers that bring their own computing hardware.
The electrical costs alone for powering fleets of servers and the cooling equipment needed to prevent them from melting can be as expensive as the equipment itself, said Rob Johnson, CEO of Vertiv, which makes power supplies, cooling equipment and many other components needed inside a modern data center.
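To see why power and cooling can rival the hardware bill, it helps to run the numbers. The sketch below uses the industry's standard PUE metric (power usage effectiveness: total facility power divided by IT power); all specific figures — the 500 kW load, the 1.6 PUE, the $0.10/kWh rate — are illustrative assumptions, not numbers from this article.

```python
# Back-of-the-envelope annual electricity cost for a self-managed data center.
# All inputs are hypothetical, chosen only to illustrate the scale involved.

it_load_kw = 500        # assumed power drawn by servers, storage, networking
pue = 1.6               # assumed facility PUE; 1.0 would mean zero overhead
price_per_kwh = 0.10    # assumed electricity rate in USD

total_kw = it_load_kw * pue                 # IT load plus cooling/power overhead
annual_kwh = total_kw * 24 * 365            # continuous draw over a year
annual_cost = annual_kwh * price_per_kwh

overhead_share = 1 - 1 / pue  # fraction of the bill paying for cooling, not computing

print(f"Total facility draw: {total_kw:.0f} kW")
print(f"Annual electricity cost: ${annual_cost:,.0f}")
print(f"Share spent on cooling and power conversion: {overhead_share:.0%}")
```

At a PUE of 1.6, more than a third of the electric bill goes to overhead rather than computation — which is exactly the line item colocation providers and vendors like Vertiv compete on.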
Colocation providers take care of those details — for a fee, of course. But in lots of cases, it's cheaper to pay a colo provider than to employ staff to source and manage that equipment. It also helps companies move expensive real-estate holdings off their books.
These colocation providers also tend to be located alongside high-bandwidth networking connections that aren't necessarily available in your garden-variety office park or downtown building complex. And, increasingly, they are forging connections with cloud providers to provide customers with "on-ramps" into public cloud services that can make it easier to maintain a hybrid cloud strategy, Cappuccio said.
For companies determined to keep their own buildings, the newest trend is scaling up: literally.
Most data centers are located in suburban or rural areas with plenty of land to build cavernous buildings just a few stories high. But land is getting more expensive around the world, so data center buildings are getting taller, which changes the economics of owning and operating your own data center.
This trend allows companies to add floor space as needed, rather than building an enormous building that might only run at 50% capacity most of the time, Cappuccio said.
Big box data centers
As it starts to take shape, edge computing has often been defined in terms of industrial applications, like smart factories, or the wireless networking strategies that cloud providers and companies like Verizon and AT&T are developing. But companies are starting to think about putting computing resources in some unexpected places.
Dell, FedEx and data-center operator Switch announced a new partnership in early November to build a cloud service that uses FedEx distribution hubs as mini data centers for customers.
"These are big facilities with large exterior walls that you can put a Switch MOD 15 [modular data center unit] on and utilize our existing real-estate footprint and simply add to it this kind of compute capacity and make it generally available," Rob Carter, FedEx CIO, told Protocol. "The way that we've looked at it is [that] a 60-kilometer radius around these facilities can be served with millisecond latency for storage, compute, and network capacity ... in urban metro and suburban industrial centers."
While that approach might not work for everyone, it hits on an idea that many companies, especially in the retail industry, are considering.
Some retail companies were already skeptical about the cloud, or at least about AWS, loath to give their nemesis Amazon any capital to fund their demise. More and more, they're realizing that they can build small data centers inside their existing big-box retail stores and distribution centers, Johnson said.
"We're beginning to see that edge build out in a way we haven't seen before," he said. The pandemic has forced retailers to get better at shipping directly to local customers, and while they are upgrading their local stores in line with that new mission, it's an ideal time to add computing capacity as well.
"We can't get away with the fact that we're going to see more and more data centers on the edge as latency becomes a bigger issue," Johnson said.
Rohit Dixit of HPE, a company that has been helping stock data centers with gear for more than two decades, agreed with Johnson's assessment.
"Something like 70% of all data generated in the future will never see a [traditional] data center," he said. Instead, more and more data will be generated at the edge, ushering in the era of a "hub and spoke" data center model with a scaled-down version of the traditional data center in the middle and smaller, purpose-built data centers at the edge of the network.
None of these trends are likely to slow the runaway cloud infrastructure train: AWS is still growing faster than almost all of the traditional players, and there are still a surprisingly large number of companies that have yet to take any step toward modernization.
Those companies will find cloud services that make sense for their business needs, and for startups on a growth curve the cloud still probably makes the most sense. But all-in cloud strategies are likely to be the exception, rather than the rule, when it comes to the next wave of modernization.
Kevin McAllister contributed reporting.