StackPath CEO Kip Turco has an easy reference when asked about the potential benefits of edge computing platforms: Taylor Swift.
Whenever Swift returns to live performances as the pandemic abates, demand for tickets to her shows could be unprecedented. Hundreds of thousands of people will be competing for seats against bots and professional ticket-buying operations sure to swarm Ticketmaster's services, some of which run in AWS regional data centers.
"Milliseconds matter," Turco told Protocol, when it comes to a situation like the release of Taylor Swift tickets. A fan request to purchase tickets will have to travel to that regional data center and back, and even the smartest minds in enterprise tech haven't figured out how to increase the speed of light.
But edge computing, or decentralized computing, could change the way businesses offering time-sensitive services build and deploy those apps. By moving a modest amount of computing power closer to end users and connecting it with a super-efficient network, Turco thinks StackPath's services could help level the playing field, especially as the so-far underwhelming rollout of 5G wireless networks finally starts to gain traction.
StackPath, which has raised nearly $400 million since it was founded in 2015 to build out its content-delivery network, thinks it will be a few years before the market is ready for this technology. But lots of companies are planning for a world in which computing resources start to move away from centralized cloud data centers and closer to the edge of the network, and Turco shared his thoughts on how this market will evolve in a recent interview.
This interview has been edited and condensed for clarity.
What's the most clear example of why edge computing is needed today?
We really view the potential of an edge compute marketplace as kind of a 2024, '25, '26, '27 evolution. Some of that is associated with the timeline of things that we're doing internally, but more of it, I think, is based on the marketplace.
The drivers that will open up that marketplace are things like 5G; as you know, everyone's advertising 5G, but you know as well as I do [that] the train has left the station, but it's not even halfway there. So it's the evolution of 5G, it's the continued development of IoT products, and the AI associated with those IoT products closer to the end user.
And then fourth, the one thing that I think really helped accelerate it this year was COVID; not COVID itself, but creating that diverse work environment, which has changed how everyone's thinking about applications in the network and their proximity to end users.
What kind of computing resources can you expect at the edge?
We think about it in terms of, generally speaking, data in motion or data at rest. If you think of data at rest, that's like a typical primary [server], whether it's in a hyperscaler like Google, Microsoft, or Amazon, or your own core data center. That's where you're keeping critical company data.
Data in motion, which I believe is what we're helping to solve for with our edge use case, is the data that needs to be quickly accessed, stored and analyzed and then dumped, data that you don't need to send back.
Here in Brookline near Boston, there's not a straight street in the whole damn town. And you have all the weather conditions, you have all the different traffic that changes on an hourly basis, and then you roll something forward like an autonomous car. To drive around and make a determination like "Hey, there's an accident up here," or "Hey, it's gonna start to snow, we need to go left or right," applications like that, where an autonomous car needs a reliable, millisecond-quick decision, are the data sets that we see moving towards [the edge].
It sounds like most of those are not going to be sort of your typical corporate applications.
From my perspective, the main corporate applications that would need to reside out there are ones that are time- or latency-sensitive. So for example, one of the applications that we're working on to be deployed at the edge is a VDI [virtual desktop infrastructure] solution. This company made a VDI investment but has struggled to get its employees to use it. Now they've rewritten that application so it's a bit lighter, and we're deploying it closer to their employees or end users, and to marketplaces, to see if it works better for them.
What kind of network infrastructure investment do you need to have in order to really support this?
I believe when you roll it all forward, the more critical component will be the network rather than actual processing or storage. We were initially a network-driven CDN company, so for us, connectivity was the deciding factor in choosing all our locations.
The hyperscalers are attacking edge compute going out from their huge centralized clouds closer to their end users. Someone like us or Fastly is doing the same thing, but by virtue of what we did with CDN, we're starting out at the internet edge. We're attacking the edge compute market from different directions.
We're not trying to be everywhere on the globe; we're trying to be in what we think are the top 50 to 100 places that you need to be to have a premier global network. And from there, we'll distribute to the edge as our customers need us to or as they would want us to, instead of trying to build out 3,000 POPs [points of presence] globally all over the place and throw dots all over a map and hope we're in the right place.
So certainly you know the hyperscalers have invested a lot in networking capabilities. When you think about their private transit networks, they're some of the best in the world. That would appear to give them a fair amount of capacity and capability to be able to extend these types of services and ideas pretty far out onto the edge as well.
There's no doubt that they have expansive networking capabilities. But at a high level, I think a majority of those run from those hyperscaler pods to other massive [internet exchanges] or data centers. Whereas the networks that we or someone like Fastly have built are more expansive from the end user, that TV sitting in the house, the mobile phone in your hand or the car that you're driving around, out towards the internet edge.
I'm not trying to compete with Google, Amazon, or Microsoft. I am trying to coexist to make sure that our platform or edge POPs are interoperable with all of those different providers or possible competitors.
From an application architectural standpoint, is there a lot of work that needs to be done to sort of make those applications edge-sensitive?
I'd say it's a fair amount of work. A majority of the work revolves around decoupling applications written for environments where you have data at rest, like a large, mammoth centralized data center, into a lighter-weight version that handles data in motion, is pushed out from the centralized location, and just gradually updates the core app. I don't think it's super easy, but I don't think it's super complex.
I'm thinking about the companies who are rewriting applications now for cloud, so if you make that effort to do that for cloud, then you have to make an additional effort to do it for edge...
I don't think it's going to be nearly as hard as rewriting an application for the cloud.
What are people overlooking about edge and decentralized computing?
To me, it's like we're back 15 years ago doing the whole cloud thing, where the market has moved and it's about how we get there and when we get there. I think it's going to be similar in that we have an evolution in technology, which is going to enable folks to think about their applications differently, and run them more efficiently in the future.
Even something crazy simple: my son, who is huge into baseball. I remember [a few years ago], we'd be coming back from baseball practice and I'd be like, "Let's get McDonald's on the way home," which when I was a kid, that's like, hands down, yes, I'm all over it. And he's like, "No, we've got to go home."
He's looking to jump on the Xbox to play Fortnite. The reason none of these kids play something like Fortnite on a mobile phone is because the experience sucks, because the throughput is bad. As 5G opens up, that's going to push companies to rethink their delivery model for the economic benefit of it.
Two years ago, my daughters were like, "Taylor Swift is coming to Gillette [Stadium], we're going to all stay up and we need everyone's device in the home, you can't go out of the home because it's not quick enough to sign up to get Taylor Swift tickets at 12:01 a.m." And I'm sitting there thinking to myself we've got zero chance, because they're all being served out of the Amazon data center probably on the East Coast, either in New York or in Montreal, and the people with the lowest ping time — milliseconds matter — are going to be able to get it.
Unless you're sitting right next to that data center, you're not getting them. With something like edge delivery for both of those scenarios, you actually could create a fair marketplace where, when people put something out that is either super expensive or super scarce, they can run that auction at a set point in time on my infrastructure or infrastructure like mine, where all the locations are out at the edge. That would provide a much more equal playing field for people.