How Snap made its old stack disappear
Photo: Souvik Banerjee/Unsplash

Protocol Enterprise

Hello, and welcome to Protocol Enterprise! Today: how Snap introduced microservices and a multicloud approach after outgrowing its launch infrastructure, why a national AI strategy isn’t likely to come together anytime soon and how AWS is trying to be “Earth’s best employer.”

Microservices, maximum effect

It sounds like a name for a death metal band, but Annihilate FSN is the project undertaken by Snap, the owner of the Snapchat social media messaging app, to rebuild its technology infrastructure to accommodate its growth.

Snap was built on Google App Engine, a serverless platform for developing and hosting web applications.

  • In 2017, Snap was running 95% of its infrastructure on GAE.
  • But that monolithic back-end system — known internally as FSN, or Feelin-So-Nice — started to run into issues tied to the company’s increasing scale.
  • “We had just continued to add more and more back-end services to GAE, and they would merge into a whole bunch of code bases that were all operating in the same big app instance,” Jerry Hunter, Snap’s senior vice president of engineering, told Protocol.
  • “Google App Engine wasn't really designed to support really big implementations,” said Hunter, who joined Snap in late 2016 after serving as AWS’ vice president of infrastructure.

Under Annihilate FSN, Snap started breaking out its back end into microservices that were backed by other services inside of Google Cloud.

  • Shortly thereafter, it added AWS as its second cloud computing provider.
  • Less than 1.5% of Snap’s infrastructure now sits on GAE, and Snap picks and chooses which workloads to place on AWS or Google Cloud Platform under its multicloud model.
  • Snap recognized that microservices would provide a lot more reliability and control, especially from a cost and performance perspective.
  • Its new architecture has led to a 65% reduction in compute costs, according to the company.
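Snap hasn’t published how it decides which workloads land on which cloud. As a purely illustrative sketch (the provider names, prices and the cost-only criterion below are hypothetical, not Snap’s actual logic), the “pick and choose” idea reduces to comparing per-workload offers across providers and routing each workload to whichever one wins:

```python
from dataclasses import dataclass

@dataclass
class Offer:
    """One provider's quote for running a given workload."""
    provider: str
    cost_per_hour: float

def place_workload(offers: list[Offer]) -> str:
    """Pick the cheapest provider for a workload (hypothetical criterion;
    a real placement decision would also weigh performance and reliability)."""
    return min(offers, key=lambda o: o.cost_per_hour).provider

# Hypothetical quotes for a single workload:
offers = [Offer("aws", 0.12), Offer("gcp", 0.10)]
print(place_workload(offers))  # prints: gcp
```

The point of the toy model is the leverage Hunter describes: because every workload is re-evaluated against two live quotes, either provider can win back a workload by cutting its price or improving its service.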

There’s great value in having cloud competitors in Snap’s supply chain, according to Hunter.

  • Snap is able to use the competition between AWS and Google Cloud to its benefit.
  • “We kind of say, 'Well, you know, your performance isn't good enough on this service' or 'This is too expensive for us — maybe there's something you could do to bring the price down,'” Hunter said.
  • “I just believe that providers work better when they've got real competition,” he said. “You just get better … pricing, better features, better service … We save a lot of money by having two clouds.”

Read the full story here.

— Donna Goodison (email | twitter)

Sponsored content from DataRobot

DataRobot's AI Cloud for Financial Services Unlocks the Art of the Possible: DataRobot continues to attract clients in financial services who want to de-risk their AI investments and rapidly scale AI to almost every part of their operations, resulting in improved productivity and higher customer satisfaction.

Read more from DataRobot

Another federal AI vacuum?

Is the U.S. going to adopt an AI strategy or nationwide AI guidelines anytime soon?

It’s never a positive sign of progress when a key leader takes off before a big project is wrapped up. In this case, the red flag comes with the departure of Lynne Parker, the deputy chief technology officer of the U.S. and director of the National Artificial Intelligence Initiative Office, a division of the White House Office of Science and Technology Policy that oversees development and coordination of federal AI activities and strategy.

Parker will return to the University of Tennessee, Knoxville, to serve as associate vice chancellor and director of the new AI Tennessee Initiative at UT. Perhaps she really misses academia, but Parker leaves what she referred to in a LinkedIn post as her “6-year service in the U.S. Government” before a few big AI policy efforts come to fruition.

Parker has been co-leading development of the National AI Research Resource, a would-be government-funded AI research hub that would provide data for training machine-learning models along with tools and computing power likely to come from the three Big Cloud companies — AWS, Microsoft Azure and Google Cloud. The final proposal for that project is expected at the end of this year.

There’s also a revision to the National AI R&D strategic plan in the works, expected by the end of the year.

And there’s another policy plan percolating at OSTP: a much-anticipated AI Bill of Rights. That has yet to see the light of day almost a year after the OSTP announced its mission to produce one in a splashy Wired opinion piece, co-authored by Eric Lander, the former OSTP director who resigned in February amid accusations he created “an atmosphere of intimidation at OSTP through flagrant verbal abuse.” In March, POLITICO revealed that Lander had been a key link at OSTP in allowing an organization led by former Google CEO Eric Schmidt to pay the salaries of some OSTP staff.

Ultimately, Parker’s departure may merely stall ongoing work, but leadership setbacks help widen the vacuum that private-sector groups aiming to drive AI policy, like Schmidt’s shadowy new Special Competitive Studies Project and the corporate-led Partnership on AI, are eager to fill.

— Kate Kaye (email | twitter)

Mars can wait

Amazon’s high-performance work culture sometimes draws criticism, a point that host Bob Safian brought up with AWS CEO Adam Selipsky in a “Masters of Scale” podcast released yesterday.

Amazon last year announced two additions to its leadership principles that guide its work, including “strive to be Earth's best employer.” Those additions came the same month that Selipsky took the helm of AWS and his predecessor, Andy Jassy, became Amazon’s CEO. Selipsky referred to the principles as the “nervous system or even the operating system” of Amazon.

“I don't know at what point you actually declare victory and say you are ‘Earth's best employer,’ but it's indicative of the fact that we want to do better and better and continue to improve,” Selipsky told Safian.

To that end, Amazon identified 100 “paper cuts in our people processes,” he said. They include issues such as what happens to employees’ stock compensation when they go on medical leave, the company’s leave of absence process and how it does IT support for employees — “things which are, in some cases, merely annoying and other places just unpleasant for employees,” according to Selipsky.

“And we're methodically fixing those,” Selipsky said, noting substantial progress on 28 of the 100 at last check.

But it's not enough just to eliminate those pain points: When unhappy employees leave Amazon, they disproportionately say it’s because of their managers, Selipsky said.

“We want to make sure that we're a place where that's not the case,” he said. “And so we think about some really important, broad long-term initiatives such as investing in managers. I personally think a lot about ideas like listening with empathy, like making sure that both I as well as other leaders are setting a culture where we listen to people and … understand where they're coming from.”

— Donna Goodison (email | twitter)

Around the enterprise

AWS is working on a challenger to the data-sharing features offered by data hotshots like Snowflake and Databricks, according to The Information.

After the passage of the Chips Act, chipmakers seem emboldened to ask for even more: Micron wants tax breaks from the state of Texas to build a manufacturing facility there.

Sponsored content from DataRobot

DataRobot's AI Cloud for Financial Services Unlocks the Art of the Possible: Banks need to secure a competitive advantage in an increasingly tight race to harness best-in-breed technology. Decision makers need to not just plan a future-ready strategy, but also recognize the value of AI that could boost not just their performance in-house but also their reputation among their global customers.

Read more from DataRobot

Thanks for reading — see you Monday!
