The next big thing in VR may be your living room furniture: Meta is releasing a new SDK for its Quest VR headset next week that will make it easier for developers to incorporate real-world surroundings into VR apps and games.
The release marks a major step toward bringing mixed reality experiences to Meta’s VR headsets, one that has the potential to make VR feel more real. It also foreshadows a world in which headsets make sense of the world around us, blurring the lines between AR and VR.
At the same time, Meta’s embrace of mixed reality tells us a lot about the company’s take on immersive computing, which includes long-term commitments to both AR and VR. “A big part of our bet has been that these are two sides of the coin,” said Mark Zuckerberg in an exclusive conversation with Protocol about the company’s mixed reality efforts.
Combining the real world with VR
The new Quest SDK, which the company officially announced Thursday, gives developers access to something it calls Presence Platform — a set of tools meant to make VR feel more natural. That includes voice and hand interaction as well as a number of features related to video passthrough.
The company’s Quest VR headset already includes ways for users to tap into a grayscale live view of their surroundings, captured by the headset’s tracking cameras. Now, developers can use the same video feed and superimpose animated VR objects over it. The headset will also allow people to map their room and tell VR apps where walls, tables, couches and other objects are. Once a room is mapped this way, VR apps can incorporate these real-world surfaces into gameplay and other interactions.
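Conceptually, a room map of this kind boils down to a list of labeled surfaces that apps can query. Here is a minimal sketch in Python of what such a scene model looks like to a developer; the class and field names are illustrative stand-ins, not Meta's actual SDK:

```python
from dataclasses import dataclass

@dataclass
class SceneSurface:
    """One mapped real-world surface, as a game might see it."""
    label: str     # semantic label, e.g. "WALL", "TABLE", "COUCH"
    center: tuple  # (x, y, z) position in room coordinates, in meters
    extent: tuple  # (width, height) of the surface, in meters

def surfaces_by_label(scene, label):
    """Return every mapped surface matching a semantic label."""
    return [s for s in scene if s.label == label]

# A tiny hand-built "room": two walls and a table.
room = [
    SceneSurface("WALL", (0.0, 1.5, -2.0), (4.0, 3.0)),
    SceneSurface("WALL", (-2.0, 1.5, 0.0), (4.0, 3.0)),
    SceneSurface("TABLE", (0.5, 0.7, -1.0), (1.2, 0.6)),
]

# A game could, for instance, spawn a virtual whiteboard on any wall it finds.
walls = surfaces_by_label(room, "WALL")
print(len(walls))  # → 2
```

The key idea is that the app never sees raw camera video of the room, only this kind of labeled geometry, which is what lets gameplay react to walls and tables.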
First demos of this type of mixed reality show a lot of potential: Resolution Games uses passthrough to let people open up a virtual hole in their living room floor, revealing a virtual lake that they can fish in. The VR drawing app Gravity Sketch makes it possible to place VR objects in the real world, and, for instance, see how a model of a chair would look in your living room.
Meta also built its own demo app, called The World Beyond, to demonstrate some of the potential of mixed reality on the Quest. In it, players get to interact with a cute little monster that likes to play fetch. Throw a virtual ball, and it bounces off your real-world walls or furniture. Blast those same walls with a special ray, and they turn into portals to a colorful animated world. “You get that grounding in the real world, but you are in a virtual experience,” said Reality Labs product manager Prabhu Parthasarathy.
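Making a virtual ball bounce off a real wall reduces to standard collision response against the mapped plane: reflect the ball's velocity about the wall's normal. A minimal sketch of the math (illustrative, not the demo's actual source):

```python
def reflect(velocity, normal):
    """Reflect a velocity vector off a surface with the given unit normal,
    using the standard mirror-bounce formula v' = v - 2 (v . n) n."""
    dot = sum(v * n for v, n in zip(velocity, normal))
    return tuple(v - 2 * dot * n for v, n in zip(velocity, normal))

# A ball flying toward a wall whose normal points back at the player (+z)
# bounces straight back:
print(reflect((0.0, 0.0, -3.0), (0.0, 0.0, 1.0)))  # → (0.0, 0.0, 3.0)
```

Once the headset knows where the real wall's plane sits, this is the same physics any game engine already does for virtual geometry.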
The company is releasing The World Beyond to the public next week, and will also make the source code available to developers.
From tracking to mixed reality and beyond
Meta’s path to mixed reality began almost by accident. When the company switched from a PC VR architecture that used external trackers to inside-out tracking on the Quest, it used what are effectively camera sensors to keep track of both VR controllers and the position of the player in the room. “We basically built the sensors just for tracking,” Zuckerberg said. “Then, an engineer had the idea: ‘Let's see if we can turn this on when you start moving outside of your boundary.’”
When the first Quest shipped in 2019, its camera sensors were already doing double duty. In addition to tracking, they provided a grayscale view of the outside world that helped Quest owners map out their play space to keep them away from walls and tables. Once in VR, those same borders would appear as a Guardian boundary, and moving beyond it caused the headset to switch from VR to a passthrough video view of the real world. “We originally just made it a safety feature,” Zuckerberg said.
Over time, Meta expanded on those safety features, adding the ability to display passthrough video of people, pets and objects entering the play space. Then, the company began to experiment with more directly combining virtual and real objects, which includes mapping of real-world spaces as well as spatial anchors that add persistence to mixed reality.
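Spatial anchors, in essence, give a virtual object a fixed pose in the real room that survives across sessions: a keyed store of positions and rotations that can be saved and resolved later. A toy sketch of that idea in Python (the store and method names are hypothetical, not Meta's API):

```python
import json
import uuid

class AnchorStore:
    """Toy persistence layer: maps an anchor ID to a pose so a virtual
    object can reappear in the same real-world spot next session."""

    def __init__(self):
        self._anchors = {}

    def create(self, position, rotation):
        """Pin a pose in room coordinates and return its ID."""
        anchor_id = str(uuid.uuid4())
        self._anchors[anchor_id] = {"position": position, "rotation": rotation}
        return anchor_id

    def resolve(self, anchor_id):
        """Look a saved pose back up by ID."""
        return self._anchors.get(anchor_id)

    def save(self):
        return json.dumps(self._anchors)  # e.g. written to disk between sessions

    @classmethod
    def load(cls, blob):
        store = cls()
        store._anchors = json.loads(blob)
        return store

store = AnchorStore()
whiteboard = store.create(position=[0.0, 1.2, -1.5], rotation=[0, 0, 0, 1])
restored = AnchorStore.load(store.save())        # simulate a new session
print(restored.resolve(whiteboard)["position"])  # → [0.0, 1.2, -1.5]
```

The hard part in a real headset is not the bookkeeping shown here but re-localizing the anchor against the tracked room geometry each time the device starts up.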
One of the first VR experiences to incorporate some of these technologies was Horizon Workrooms, the company’s VR collaboration software. The Workrooms app allows people to map their desk, incorporate a real laptop into an avatar-based VR meeting and anchor a virtual whiteboard in real-world surroundings. “We initially started working on it almost as a demo, bringing a lot of technologies together,” Zuckerberg said. Over time, it became clear that Workrooms could be more than just a tech demo, and the company began making it more widely available.
Incorporating passthrough video is just a first step toward a more immersive mixed reality future. Just as important will be object recognition; Meta has not revealed any concrete plans for bringing object recognition to future headsets, but it could represent a massive shift for the technology.
Right now, a Quest headset only knows that you are sitting in front of a desk if you map and then label that desk. In the future, headsets might be able to recognize both the desk as well as the objects on the desk, to seamlessly incorporate everything into mixed reality experiences.
Getting there won’t be easy, though, cautioned Parthasarathy. “Object recognition is a very complex engineering problem,” he said. “I don't expect anybody to have ... this solved anytime soon, including us.” However, the potential upside is massive, he agreed. “Object understanding is going to be a fundamental shift in how we use headsets, whether it's VR or AR,” Parthasarathy said.
Betting on VR, even if others don’t
At launch, Meta’s take on mixed reality is constrained by the limited video fidelity of the Quest’s tracking sensors. Developers can make up for some of that by shifting the focus to higher-fidelity VR objects. “When you're in the middle of those experiences, black and white kind of disappears on you,” Parthasarathy said. “We think there's a lot of opportunity to build mixed reality, even on Quest.”
Things should only improve with future headsets that have better image sensors, including the work-focused Project Cambria device Meta plans to release later this year. However, incorporating mixed reality now is just as much about the future of visual computing, which also includes augmented reality. By letting developers overlay virtual objects on the real world, Meta can give them an early taste of the potential of future AR hardware, which could also make its headsets more appealing to AR purists.
“There is a cohort of people who think that AR is going to be a big thing, but who don't yet believe that much in VR,” Zuckerberg said. “It’s pretty exciting to get that whole group of people into the fold.” VR skepticism isn’t just a phenomenon among developers; companies like Magic Leap have decided to focus on AR over VR, even if it means producing devices that are expensive and not consumer-ready. Zuckerberg argued that this approach ignores existing consumer habits.
“In the tech industry, we have this bias where we think phones are really critical and TVs are not,” Zuckerberg said. “But the average American spends as much or more time on a TV than a phone.” Ultimately, VR headsets could become a kind of immersive version of the TV set, while AR glasses could become a future mobile device, which is why Meta is investing in both, he said.
“Being able to ship VR devices that gain some mainstream adoption and building a developer community allows us to build out the toolchain, a good community and monetization for developers even before [glasses-like] AR devices show up,” Zuckerberg said.
Mixed reality tools that allow developers to play in both worlds will enable them to get ready for that future. “That's [the] bet we're making,” he said. “I could be wrong. But so far, the data suggests that VR is going to be quite important. So I'm feeling pretty good about the path that we're on.”