Enterprise

Taylor Swift tickets are hard to get. Edge computing could fix that.

The CEO of StackPath thinks it will be a few years before decentralized edge computing makes sense for more applications, but there are a few in-demand services that could benefit from that approach today.

A Taylor Swift concert in 2015

After pandemic restrictions lift, Taylor Swift concerts could be hard to get into. Might edge computing be just the ticket?

Photo: Chaz McGregor/Unsplash

StackPath CEO Kip Turco has an easy reference when asked about the potential benefits of edge computing platforms: Taylor Swift.

Whenever Swift returns to live performances as the pandemic abates, demand for tickets to her shows could be unprecedented. Hundreds of thousands of people will be competing for seats against bots and professional ticket-buying operations sure to swarm Ticketmaster's services, some of which run in AWS regional data centers.

"Milliseconds matter," Turco told Protocol, when it comes to a situation like the release of Taylor Swift tickets. A fan request to purchase tickets will have to travel to that regional data center and back, and even the smartest minds in enterprise tech haven't figured out how to increase the speed of light.
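To put that physics in perspective, here's a back-of-the-envelope sketch (our illustration, not StackPath code) of the lower bound the speed of light puts on round-trip time over fiber, and how much closer an edge POP gets you. The distances are hypothetical.

```python
# The physical floor on round-trip time: no amount of engineering beats it.
SPEED_OF_LIGHT_KM_S = 299_792   # km/s in a vacuum
FIBER_FACTOR = 0.67             # light travels roughly 2/3 of c in optical fiber

def min_rtt_ms(distance_km: float) -> float:
    """Theoretical minimum round-trip time over fiber, in milliseconds."""
    one_way_s = distance_km / (SPEED_OF_LIGHT_KM_S * FIBER_FACTOR)
    return 2 * one_way_s * 1000

# A fan ~1,500 km from a regional data center vs. ~50 km from an edge POP:
print(f"regional: {min_rtt_ms(1500):.1f} ms")  # ~14.9 ms before any processing
print(f"edge:     {min_rtt_ms(50):.2f} ms")    # ~0.50 ms
```

Real round trips are several times worse once routing and processing are added, but the gap between the two scenarios is the point.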

But edge computing, or decentralized computing, could change the way businesses offering time-sensitive services build and deploy those apps. By moving a modest amount of computing power closer to end users and connecting it with a super-efficient network, Turco thinks StackPath's services could help level the playing field, especially as the so-far underwhelming rollout of 5G wireless networks starts to gain traction.

StackPath, which has raised nearly $400 million since it was founded in 2015 to build out its content-delivery network, thinks it will be a few years before the market is ready for this technology. But lots of companies are planning for a world in which computing resources start to move away from centralized cloud data centers and closer to the edge of the network, and Turco shared his thoughts on how this market will evolve in a recent interview.

This interview has been edited and condensed for clarity.

StackPath CEO Kip Turco sees edge computing as the future. Photo: StackPath

What's the clearest example of why edge computing is needed today?

We really view the potential of an edge compute marketplace as kind of like a 2024, '25, '26, '27 evolution. Some of that is associated with the timeline of things that we're doing internally, but more of it, I think, is based on the marketplace.
The drivers that will open up that marketplace are things like 5G; as you know, everyone's advertising 5G, but you know as well as I do [that] the train has left the station, but it's not even halfway there. So it's the evolution of 5G, it's the continued development of IoT products, and the AI associated with those IoT products closer to the end user.

And then fourth, the one thing that I think really helped accelerate it this year: COVID. Not COVID itself, but the distributed work environment it created, which has changed how everyone's thinking about applications in the network and their proximity to the end users.

What kind of computing resources can you expect at the edge?

We think about it in terms of, generally speaking, data in motion or data at rest. If you think of data at rest, that's like a typical primary [server], whether it's in a hyperscaler like Google, Microsoft, or Amazon, or your own core data center. That's where you're keeping critical company data.

Processing and storing the data in motion, which I believe is what we're helping to solve for our edge use case, is the data that needs to be quickly accessed, stored and analyzed and then dumped, that you don't need to send back.

Here in Brookline near Boston, there's not a straight street in the whole damn town. And you have all the weather conditions, you have all the different traffic that changes on an hourly basis, and then you roll something forward like an autonomous car. To drive around and make a determination like "Hey, there's an accident up here," or "Hey, it's gonna start to snow, we need to go left or right," applications like that for an autonomous car to get a reliable, millisecond-quick decision are the data sets that we see moving towards [the edge].
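The "data in motion" pattern Turco describes can be sketched roughly like this: an edge node ingests readings, acts on them locally, discards them, and forwards only the little that matters back to the core. Everything here is illustrative; none of it is a real StackPath API.

```python
# Hypothetical edge-node sketch: process and drop data in motion locally,
# forward only anomalies to the data-at-rest tier in the core data center.
from collections import deque
from statistics import mean

class EdgeProcessor:
    def __init__(self, window: int = 100, threshold: float = 2.0):
        self.recent = deque(maxlen=window)   # data in motion: short-lived
        self.threshold = threshold
        self.summary_for_core = []           # the little that goes upstream

    def ingest(self, reading: float) -> None:
        self.recent.append(reading)
        baseline = mean(self.recent)
        # Only readings far from the local baseline are worth sending back.
        if abs(reading - baseline) > self.threshold:
            self.summary_for_core.append(reading)

proc = EdgeProcessor(window=5, threshold=2.0)
for r in [1.0, 1.1, 0.9, 5.0, 1.0]:
    proc.ingest(r)
print(proc.summary_for_core)  # [5.0] -- the spike is flagged, the rest is dropped
```

The decision happens in milliseconds at the edge; only the flagged event ever travels back to the centralized cloud.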

It sounds like most of those are not going to be sort of your typical corporate applications.

From my perspective, the main corporate applications that would need to reside out there are ones that are time- or latency-sensitive. So for example, one of the applications that we're working on to be deployed at the edge is a VDI [virtual desktop infrastructure] solution. This company made a VDI investment, but has struggled to get their employees to use it. Now, we've taken that VDI and they've rewritten the application so it's a bit lighter. And we're deploying it closer to their employees or end users, and to marketplaces, to see if it works better for them.

What kind of network infrastructure investment do you need to have in order to really support this?

I believe when you roll it all forward, the more critical component will be the network rather than actual processing or storage. We were initially a network-driven CDN company, so for us, connectivity was the deciding factor in choosing all our locations.
The hyperscalers are attacking edge compute by moving out from their huge centralized clouds closer to their end users. Someone like us or Fastly is doing the same thing, but by virtue of what we did with CDN, we're starting from the internet edge. We're attacking the edge compute market from different directions.

We're not trying to be everywhere on the globe; we're trying to be in the top 50 to 100 places that you need to be to have a premier global network. And from there, we'll distribute to the edge as our customers need us to or as they would want us to, instead of trying to build out 3,000 POPs [points of presence] globally, throwing dots all over a map and hoping we're in the right place.

So certainly you know the hyperscalers have invested a lot in networking capabilities. When you think about their private transit networks, they're some of the best in the world. That would appear to give them a fair amount of capacity and capability to be able to extend these types of services and ideas pretty far out onto the edge as well.

There's no doubt that they have expansive networking capabilities. But at a high level, I think a majority of those run from those hyperscaler pods to other massive [internet exchanges] or data centers. The networks that we've built, or that someone like Fastly has built, are more expansive from the end user's side: that TV sitting in the house, the mobile phone in your hand or the car that you're driving, out towards the internet edge.

I'm not trying to compete with Google, Amazon, or Microsoft. I am trying to coexist to make sure that our platform or edge POPs are interoperable with all of those different providers or possible competitors.

From an application architectural standpoint, is there a lot of work that needs to be done to sort of make those applications edge-sensitive?

I'd say it's a fair amount of work. A majority of the work revolves around decoupling their applications: taking what was written for environments where you have data at rest, like a mammoth centralized data center, and creating a lighter-weight version for data in motion that is pushed out from the centralized location and just gradually updates the core app. I don't think it's super easy, but I don't think it's super complex.

I'm thinking about the companies who are rewriting applications now for cloud, so if you make that effort to do that for cloud, then you have to make an additional effort to do it for edge...

I don't think it's going to be nearly as hard as writing an application for the cloud.

What are people overlooking about edge and decentralized computing?

To me, it's like we're back 15 years ago doing the whole cloud thing, where it's moved to the market and it's about how we get there and when we get there. I think it's going to be similar in that we have an evolution in technology, which is going to enable folks to think about their applications differently, and run them more efficiently in the future.

Even something crazy simple: my son is huge into baseball. I remember [a few years ago], we'd be coming back from baseball practice and I'd be like, "Let's get McDonald's on the way home," which when I was a kid was, hands down, yes, I'm all over it. And he's like, "No, we've got to go home."

He's looking to jump on the Xbox to play Fortnite. The reason none of these kids play something like Fortnite on a mobile phone is because the experience sucks; the throughput is bad. As 5G opens up, that's going to push companies to rethink their delivery model for the economic benefit of it.

Two years ago, my daughters were like, "Taylor Swift is coming to Gillette [Stadium], we're going to all stay up and we need everyone's device in the home, you can't go out of the home because it's not quick enough to sign up to get Taylor Swift tickets at 12:01 a.m." And I'm sitting there thinking to myself we've got zero chance, because they're all being served out of the Amazon data center probably on the East Coast, either in New York or in Montreal, and the people with the lowest ping time — milliseconds matter — are going to be able to get it.

Unless you're sitting right next to that data center, you're not getting those tickets. With something like edge delivery for both of those scenarios, you actually could create a fair marketplace: when people put something out that is either super expensive or super scarce, they can run that auction at a set point in time on my infrastructure, or infrastructure like mine, where all the locations are out at the edge. That would provide a much more equal playing field for people.
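One way to read that "fair marketplace" idea is a batched lottery instead of first-come-first-served: every request arriving at an edge POP within the sale window is pooled, and winners are drawn at random, so a low ping time stops being an advantage. This sketch is our illustration of the concept, not StackPath's product.

```python
# Hypothetical fair ticket drop: requests in the window are pooled, then a
# seeded lottery picks winners. Arrival order (i.e., ping time) is ignored.
import random

def run_fair_drop(requests: list[str], seats: int, seed: int = 1201) -> list[str]:
    """All requests arriving within the window are treated equally."""
    rng = random.Random(seed)     # a shared seed keeps distributed POPs consistent
    pool = sorted(requests)      # arrival order is deliberately discarded
    rng.shuffle(pool)
    return pool[:seats]

fans = [f"fan{i}" for i in range(10)]  # fan0 has the lowest ping; it won't help
winners = run_fair_drop(fans, seats=3)
print(len(winners))  # 3 winners, independent of who clicked first
```

Because the pool is sorted before the seeded shuffle, every POP that saw the same set of requests draws the same winners, regardless of the order requests arrived in.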
