Snap CTO Bobby Murphy on embracing Apple’s AR glasses

Snap is building its own AR Spectacles, but the company also wants to embrace third-party devices.

Bobby Murphy, chief technology officer at Snap Inc.

Bobby Murphy wants Snap’s AR lenses to run everywhere — even on hardware made by competitors.

Photo: Getty Images for Snap Inc

Snap is all in on AR: The Snapchat maker has been building its own AR glasses, and is currently testing an early version with a small group of creators. Snap has also signed up 250,000 creators to build mobile-centric AR experiences through its Lens Studio platform, whose lenses have collectively been viewed over 3.5 trillion times.

Snap celebrated those milestones at its Lens Fest Tuesday, which the company also used to release a number of updates for both mobile and headworn AR. Snap CTO Bobby Murphy recently put that work in context in an interview with Protocol, in which he talked about the company’s progress in building AR Spectacles, why it isn’t focused on non-AR wearables anymore and why it ultimately also wants to build apps and experiences for AR devices made by its competitors.

This interview has been edited and condensed for clarity.

It’s been about half a year since the launch of the new Spectacles with AR. What have you learned from having developers work with that device?

The first thing is just the sheer excitement and passion among our early community of creators. It's just amazing to see how many people are really loving the opportunity to develop AR experiences in a way that aligns with how we all envision the future, even if it is a decade away.

But we're also learning a lot. We are testing the design of inputs and the navigation of experiences in a non-mobile format. All of the work that we're seeing there, all of the imagination and creativity, is helping us design around those issues.

These AR Spectacles are only available on an invite-only basis right now. Will you keep making non-AR consumer versions of Spectacles?

For us, the objective of the earlier iterations of Spectacles was to learn from the experience of having any kind of technology on your face, even something as basic as just a camera. We got a ton of insights [on how to] design that thoughtfully, and learn from our community [about] the ways in which people are willing or unwilling to wear glasses when they go about their day, and the use cases and the behavior that would accompany having a pair of glasses with some camera technology on it.

Our main priority going forward is to continue to develop and expand on the AR capabilities of the Spectacles that were announced this year. We'll continue to expand on that, but obviously bring in a lot of the learning that we've had over these last several years with prior versions as well.

A number of other companies are working on consumer AR glasses as well, including Apple and Facebook. Do you eventually want to get to a point where Snap’s software runs on those third-party devices as well?

We've always tried to be as hardware-agnostic as we can be, and leverage the best of what any device has to offer, [even] on our mobile platform. If you look at what we're doing around Camera Kit: Samsung has a carousel of lenses, enabled by Camera Kit, in the native camera of their A-series of [phones]. They are leveraging our technology, our tools, our lens creator and developer community.

We will continue to look for opportunities [like these] to work with any company who is doing innovative work in the space. Whether we are building our own hardware or operating our software on other companies’ hardware, we're going to empower the best form of AR experiences that we can.

Ultimately, you want to do both, right? Build AR glasses and also run on third-party AR hardware?

We already are, through our mobile platform. We are an application, Snapchat, that is built on Apple and Google hardware and Apple and Google platforms. Through Camera Kit, we're powering AR experiences on other applications. And we are also investing in our own hardware, trying to learn and understand what will work best in a new form factor.

How long will it take until we get to mainstream consumer AR glasses? And what are the biggest roadblocks right now?

What's so fascinating about development in this space is that we're balancing a ton of constraints. If we can better understand how these devices are used, which we're doing now through early iterations of Spectacles, we can start to appreciate the experiences that need to be powered, the duration we expect of a lens experience or the capabilities that are really going to create the most engaging experiences. Through all of that learning, we can start to iteratively get to the right sweet spot for this technology. But the biggest [constraints] are probably around battery, compute power and the display of [these] devices.

In terms of timing, I think it will happen more incrementally than people may appreciate. We certainly expect that almost everybody will get value out of wearing a pair of Spectacles or a wearable AR device someday, but that is still a while away.

Until then, we'll continue to develop Spectacles [iteratively]. Today, the battery only works for half an hour. Maybe sometime in the not-too-distant future, it works for an hour, or two or three hours. Then, you start to come up with places and opportunities and experiences that can fit the constraints of the device. We expect that there will be a lot of opportunities to allow more and more people to engage with AR, even in shorter or more controlled experiences, well before we get to a truly consumer-ready, always-on, lightweight AR experience.

What could those experiences that give people a glimpse of an AR future look like?

One area that we're excited about is events. We've already done some really spectacular AR experiences in many different places. We have one down at Art Basel with [the artist] Alex Israel. In his exhibit, you can walk in and use your phone to scan and trigger an amazingly vivid and rich experience. That's something that could be empowered through wearable AR. Maybe you put on a pair of Spectacles, spend 30 minutes walking through an exhibit and then you take it off. Things like that will become more and more interesting and feasible over time.

