Snap is all in on AR: The Snapchat maker has been building its own AR glasses, and is currently testing an early version with a small group of creators. Snap has also signed up 250,000 creators to build mobile-centric AR experiences through its Lens Studio platform, whose lenses have collectively been viewed over 3.5 trillion times.
Snap celebrated those milestones at its Lens Fest on Tuesday, where the company also released a number of updates for both mobile and headworn AR. Snap CTO Bobby Murphy recently put that work in context in an interview with Protocol, in which he talked about the company’s progress in building AR Spectacles, why it is no longer focused on non-AR wearables and why it ultimately also wants to build apps and experiences for AR devices made by its competitors.
This interview has been edited and condensed for clarity.
It’s been about half a year since the launch of the new Spectacles with AR. What have you learned from having developers work with that device?
The first thing is just the sheer excitement and passion among our early community of creators. It's just amazing to see how many people are really loving the opportunity to develop AR experiences in a way that aligns with how we all envision the future, even if it is a decade away.
But we're also learning a lot. We are testing the design of inputs and the navigation of experiences in a non-mobile format. All of the work that we're seeing there, all of the imagination and creativity, is helping us design around those issues.
These AR Spectacles are only available on an invite-only basis right now. Will you keep making non-AR consumer versions of Spectacles?
For us, the objective of the earlier iterations of Spectacles was to learn from the experience of having any kind of technology on your face, even something as basic as just a camera. We got a ton of insights [on how to] design that thoughtfully, and learn from our community [about] the ways in which people are willing or unwilling to wear glasses when they go about their day, and the use cases and the behavior that would accompany having a pair of glasses with some camera technology on it.
Our main priority going forward is to continue to develop and expand on the AR capabilities of the Spectacles that were announced this year. We'll keep building on that, but obviously bring in a lot of the learning that we've had over these last several years with prior versions as well.
A number of other companies are working on consumer AR glasses as well, including Apple and Facebook. Do you eventually want to get to a point where Snap’s software runs on those third-party devices as well?
We've always tried to be as hardware-agnostic as we can be, and leverage the best of what any device has to offer, [even] on our mobile platform. If you look at what we're doing around Camera Kit: Samsung has a carousel of lenses, enabled by Camera Kit, in the native camera of their A-series of [phones]. They are leveraging our technology, our tools, our lens creator and developer community.
We will continue to look for opportunities [like these] to work with any company who is doing innovative work in the space. Whether we are building our own hardware or operating our software on other companies’ hardware, we're going to empower the best form of AR experiences that we can.
Ultimately, you want to do both, right? Build AR glasses and also run on third-party AR hardware?
We already are, through our mobile platform. We are an application, Snapchat, that is built on Apple and Google hardware and Apple and Google platforms. Through Camera Kit, we're powering AR experiences on other applications. And we are also investing in our own hardware, trying to learn and understand what will work best in a new form factor.
How long will it take until we get to mainstream consumer AR glasses? And what are the biggest roadblocks right now?
What's so fascinating about development in this space is that we're balancing a ton of constraints. If we can better understand how these devices are used, which we're doing now through early iterations of Spectacles, we can start to appreciate the experiences that need to be powered, the duration we expect of a lens experience or the capabilities that are really going to create the most engaging experiences. Through all of that learning, we can start to iteratively get to the right sweet spot for this technology. But the biggest [constraints] are probably around battery, compute power and the display of [these] devices.
In terms of timing, I think it will happen more incrementally than people may appreciate. We certainly expect that almost everybody will get value out of wearing a pair of Spectacles or a wearable AR device someday, but that will still be a while.
Until then, we'll continue to develop Spectacles [iteratively]. Today, the battery only lasts for half an hour. Maybe sometime in the not-too-distant future, it lasts for an hour, or two or three hours. Then, you start to come up with places and opportunities and experiences that can fit the constraints of the device. We expect that there will be a lot of opportunities to allow more and more people to engage with AR, even in shorter or more controlled experiences, well before we get to a truly consumer-ready, always-on, lightweight AR experience.
What could those experiences that give people a glimpse of an AR future look like?
One area that we're excited about is events. We've already done some really spectacular AR experiences in many different places. We have one down at Art Basel with [the artist] Alex Israel. In his exhibit, you can walk in and use your phone to scan and trigger an amazingly vivid and rich experience. That's something that could be empowered through wearable AR. Maybe you put on a pair of Spectacles, spend 30 minutes walking through an exhibit and then take them off. Things like that will become more and more interesting and feasible over time.