Entertainment

Exclusive: Mixed reality on Meta’s Quest offers a glimpse at the future of visual computing

By incorporating a view of the real world in VR experiences, Meta is getting ready for smarter headsets and ultimately AR glasses.

The next big thing in VR may be your living room furniture: Meta is releasing a new SDK for its Quest VR headset next week that will make it easier for developers to incorporate real-world surroundings into VR apps and games.

The release marks a major step toward bringing mixed reality experiences to Meta’s VR headsets, a move that has the potential to make VR feel more real. It also foreshadows a world in which headsets make sense of the world around us, blurring the lines between AR and VR.

At the same time, Meta’s embrace of mixed reality tells us a lot about the company’s take on immersive computing, which includes long-term commitments to both AR and VR. “A big part of our bet has been that these are two sides of the coin,” said Mark Zuckerberg in an exclusive conversation with Protocol about the company’s mixed reality efforts.

Combining the real world with VR

The new Quest SDK, which the company officially announced Thursday, gives developers access to something it calls Presence Platform — a set of tools meant to make VR feel more natural. That includes voice and hand interaction as well as a number of features related to video passthrough.

The company’s Quest VR headset already includes ways for users to tap into a grayscale live view of their surroundings, captured by the headset’s tracking cameras. Now, developers can use the same video feed and superimpose animated VR objects over it. The headset will also allow people to map their room and tell VR apps where walls, tables, couches and other objects are. Once a room is mapped this way, VR apps can incorporate these real-world surfaces into gameplay and other interactions.
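The room-mapping idea described above can be sketched in a few lines of Python. This is a hypothetical illustration only — the names (`MappedSurface`, `SceneModel`, `find`) are invented for this sketch and are not the actual Presence Platform API:

```python
from dataclasses import dataclass

# Hypothetical sketch: each mapped surface carries a semantic label
# ("wall", "table", "couch") that apps can query at runtime.
@dataclass
class MappedSurface:
    label: str            # semantic label assigned during room mapping
    center: tuple         # position in room coordinates (meters)
    extents: tuple        # width/height of the surface (meters)

class SceneModel:
    def __init__(self):
        self.surfaces = []

    def add(self, surface: MappedSurface):
        self.surfaces.append(surface)

    def find(self, label: str):
        """Return all mapped surfaces carrying the given semantic label."""
        return [s for s in self.surfaces if s.label == label]

# A user maps their room once; apps then query the result.
room = SceneModel()
room.add(MappedSurface("wall", (0.0, 1.5, -2.0), (4.0, 3.0)))
room.add(MappedSurface("table", (1.0, 0.7, 0.0), (1.2, 0.8)))

walls = room.find("wall")  # an app could, say, spawn a portal on each wall
```

The point is the division of labor: the headset does the mapping and labeling once, and every app can then reason about the same real-world surfaces without re-scanning.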

First demos of this type of mixed reality show a lot of potential: Resolution Games uses passthrough to let people open up a virtual hole in their living room floor, revealing a virtual lake that they can fish in. The VR drawing app Gravity Sketch makes it possible to place VR objects in the real world, and, for instance, see how a model of a chair would look in your living room.

Meta also built its own demo app, called The World Beyond, to demonstrate some of the potential of mixed reality on the Quest. In it, players get to interact with a cute little monster that likes to play fetch. Throw a virtual ball, and it bounces off your real-world walls or furniture. Blast those same walls with a special ray, and they turn into portals to a colorful animated world. “You get that grounding in the real world, but you are in a virtual experience,” said Reality Labs product manager Prabhu Parthasarathy.
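The ball-bouncing interaction in The World Beyond boils down to standard physics against mapped surfaces: reflect the ball's velocity about the wall's surface normal. A minimal sketch (not Meta's code; the function and parameter names are assumptions for illustration):

```python
import numpy as np

# Minimal sketch: reflecting a virtual ball's velocity off a mapped
# real-world surface, given the surface's unit normal vector.
def bounce(velocity, normal, restitution=0.8):
    """Reflect velocity about the surface normal, damped by restitution."""
    v = np.asarray(velocity, dtype=float)
    n = np.asarray(normal, dtype=float)
    n = n / np.linalg.norm(n)          # ensure the normal is unit length
    reflected = v - 2.0 * np.dot(v, n) * n
    return restitution * reflected

# A ball flying straight at a wall whose normal points back at the player:
v_out = bounce([0.0, 0.0, -5.0], [0.0, 0.0, 1.0])
# v_out now points back toward the player, scaled by the restitution factor
```

Because the walls come from the user's room map rather than from the game's own level geometry, the same formula makes a virtual object appear to collide with real furniture.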

The company is releasing The World Beyond to the public next week, and will also make the source code available to developers.

From tracking to mixed reality and beyond

Meta’s path to mixed reality almost began by accident. When the company switched from a PC VR architecture that used external trackers to inside-out tracking on the Quest, it used what are effectively camera sensors to keep track of both VR controllers and the position of the player in the room. “We basically built the sensors just for tracking,” Zuckerberg said. “Then, an engineer had the idea: ‘Let's see if we can turn this on when you start moving outside of your boundary.’”

When the first Quest shipped in 2019, the camera sensors were already pulling double duty. In addition to tracking, they were also used to provide a grayscale view of the outside world that helped Quest owners map out their play space and keep them away from walls and tables. Once in VR, those same borders would come up as a guardian, and moving beyond the guardian caused the headset to switch from VR to a passthrough video view of the real world. “We originally just made it a safety feature,” Zuckerberg said.

Over time, Meta expanded on those safety features, adding the ability to display passthrough video of people, pets and objects entering the play space. Then, the company began to experiment with more directly combining virtual and real objects, which includes mapping of real-world spaces as well as spatial anchors that add persistence to mixed reality.
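Spatial anchors, mentioned above, are what give mixed reality persistence: a pose saved under a stable ID so a virtual object can reappear in the same real-world spot in a later session. A hypothetical sketch of the concept (the store and function names are invented here, not the real SDK):

```python
import uuid

# Hypothetical sketch of spatial-anchor persistence: an anchor is a pose
# (position + orientation quaternion) saved under a stable ID.
def create_anchor(store: dict, position, rotation):
    """Save a pose and return a stable ID that survives across sessions."""
    anchor_id = str(uuid.uuid4())
    store[anchor_id] = {"position": list(position), "rotation": list(rotation)}
    return anchor_id

def resolve_anchor(store: dict, anchor_id):
    """Look up a previously saved pose; returns None if it was never saved."""
    return store.get(anchor_id)

# Session one: the user pins a virtual whiteboard to a real wall.
store = {}
whiteboard = create_anchor(store, (0.5, 1.2, -1.0), (0.0, 0.0, 0.0, 1.0))

# A later session would reload the store (e.g. from disk) and re-place
# the whiteboard at the same real-world pose.
pose = resolve_anchor(store, whiteboard)
```

In a real system the store would be backed by the headset's map of the room, so the saved pose stays aligned with the physical space rather than with any one app session.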

One of the first VR experiences to incorporate some of these technologies has been Horizon Workrooms, the company’s VR collaboration software. The Workrooms app allows people to map their desk, incorporate a real laptop into an avatar-based VR meeting and anchor a virtual whiteboard in real-world surroundings. “We initially started working on it almost as a demo, bringing a lot of technologies together,” Zuckerberg said. Over time, it became clear that Workrooms could be more than just a tech demo, and the company began making it more widely available.

Incorporating passthrough video is just a first step toward a more immersive mixed reality future. Just as important will be object recognition; Meta has not revealed any concrete plans for bringing object recognition to future headsets, but it could represent a massive shift for the technology.

Right now, a Quest headset only knows that you are sitting in front of a desk if you map and then label that desk. In the future, headsets might be able to recognize both the desk and the objects on it, seamlessly incorporating everything into mixed reality experiences.

Getting there won’t be easy, though, cautioned Parthasarathy. “Object recognition is a very complex engineering problem,” he said. “I don't expect anybody to have ... this solved anytime soon, including us.” However, the potential upside is massive, he agreed. “Object understanding is going to be a fundamental shift in how we use headsets, whether it's VR or AR,” Parthasarathy said.

Betting on VR, even if others don’t

At launch, Meta’s take on mixed reality is constrained by the low video fidelity of the Quest’s tracking sensors. Developers can make up for some of that by shifting the focus to higher-fidelity VR objects. “When you're in the middle of those experiences, black and white kind of disappears on you,” Parthasarathy said. “We think there's a lot of opportunity to build mixed reality, even on Quest.”

Things will only improve with future headsets that have better image sensors, including the work-focused Project Cambria device Meta plans to release later this year. However, embracing mixed reality now is just as much about the future of visual computing, which also includes augmented reality. By overlaying virtual objects on the real world, developers can get an early taste of the potential of future AR hardware — which could also make Meta’s headsets more appealing to AR purists.

“There is a cohort of people who think that AR is going to be a big thing, but who don't yet believe that much in VR,” Zuckerberg said. “It’s pretty exciting to get that whole group of people into the fold.” VR skepticism isn’t just a phenomenon among developers; companies like Magic Leap have decided to focus on AR over VR, even if it means producing devices that are expensive and not consumer-ready. Zuckerberg argued that this approach ignores existing consumer habits.

“In the tech industry, we have this bias where we think phones are really critical and TVs are not,” Zuckerberg said. “But the average American spends as much or more time on a TV than a phone.” Ultimately, VR headsets could become a kind of immersive version of the TV set, while AR glasses could become a future mobile device, which is why Meta is investing in both, he said.

“Being able to ship VR devices that gain some mainstream adoption and building a developer community allows us to build out the toolchain, a good community and monetization for developers even before [glasses-like] AR devices show up,” Zuckerberg said.

Mixed reality tools that allow developers to play in both worlds will enable them to get ready for that future. “That's [the] bet we're making,” he said. “I could be wrong. But so far, the data suggests that VR is going to be quite important. So I'm feeling pretty good about the path that we're on.”

Fintech

Judge Zia Faruqui is trying to teach you crypto, one ‘SNL’ reference at a time

His decisions on major cryptocurrency cases have quoted "The Big Lebowski," "SNL," and "Dr. Strangelove." That’s because he wants you — yes, you — to read them.

The ways Zia Faruqui (right) has weighed in on cases that have come before him can give lawyers clues as to what legal frameworks will pass muster.

Photo: Carolyn Van Houten/The Washington Post via Getty Images

“Cryptocurrency and related software analytics tools are ‘The wave of the future, Dude. One hundred percent electronic.’”

That’s not a quote from "The Big Lebowski" — at least, not directly. It’s a quote from a Washington, D.C., district court memorandum opinion on the role cryptocurrency analytics tools can play in government investigations. The author is Magistrate Judge Zia Faruqui.

Veronica Irwin

Veronica Irwin (@vronirwin) is a San Francisco-based reporter at Protocol covering fintech. Previously she was at the San Francisco Examiner, covering tech from a hyper-local angle. Before that, her byline was featured in SF Weekly, The Nation, Techworker, Ms. Magazine and The Frisc.

The financial technology transformation is driving competition, creating consumer choice, and shaping the future of finance. Hear from seven fintech leaders who are reshaping the future of finance, and join the inaugural Financial Technology Association Fintech Summit to learn more.

FTA
The Financial Technology Association (FTA) represents industry leaders shaping the future of finance. We champion the power of technology-centered financial services and advocate for the modernization of financial regulation to support inclusion and responsible innovation.
Enterprise

AWS CEO: The cloud isn’t just about technology

As AWS preps for its annual re:Invent conference, Adam Selipsky talks product strategy, support for hybrid environments, and the value of the cloud in uncertain economic times.

Photo: Noah Berger/Getty Images for Amazon Web Services

AWS is gearing up for re:Invent, its annual cloud computing conference where announcements this year are expected to focus on its end-to-end data strategy and delivering new industry-specific services.

It will be the second re:Invent with CEO Adam Selipsky as leader of the industry’s largest cloud provider after his return last year to AWS from data visualization company Tableau Software.

Donna Goodison

Donna Goodison (@dgoodison) is Protocol's senior reporter focusing on enterprise infrastructure technology, from the 'Big 3' cloud computing providers to data centers. She previously covered the public cloud at CRN after 15 years as a business reporter for the Boston Herald. Based in Massachusetts, she also has worked as a Boston Globe freelancer, business reporter at the Boston Business Journal and real estate reporter at Banker & Tradesman after toiling at weekly newspapers.

Image: Protocol

We launched Protocol in February 2020 to cover the evolving power center of tech. It is with deep sadness that just under three years later, we are winding down the publication.

As of today, we will not publish any more stories. All of our newsletters, apart from our flagship, Source Code, will no longer be sent. Source Code will be published and sent for the next few weeks, but it will also close down in December.

Bennett Richardson

Bennett Richardson (@bennettrich) is the president of Protocol. Prior to joining Protocol in 2019, Bennett was executive director of global strategic partnerships at POLITICO, where he led strategic growth efforts including POLITICO's European expansion in Brussels and POLITICO's creative agency POLITICO Focus during his six years with the company. Prior to POLITICO, Bennett was co-founder and CMO of Hinge, the mobile dating company recently acquired by Match Group. Bennett began his career in digital and social brand marketing working with major brands across tech, energy, and health care at leading marketing and communications agencies including Edelman and GMMB. Bennett is originally from Portland, Maine, and received his bachelor's degree from Colgate University.

Enterprise

Why large enterprises struggle to find suitable platforms for MLops

As companies expand their use of AI beyond running just a few machine learning models, and as larger enterprises go from deploying hundreds of models to thousands and even millions of models, ML practitioners say that they have yet to find what they need from prepackaged MLops systems.


Photo: artpartner-images via Getty Images

On any given day, Lily AI runs hundreds of machine learning models using computer vision and natural language processing that are customized for its retail and ecommerce clients to make website product recommendations, forecast demand, and plan merchandising. But this spring when the company was in the market for a machine learning operations platform to manage its expanding model roster, it wasn’t easy to find a suitable off-the-shelf system that could handle such a large number of models in deployment while also meeting other criteria.

Some MLops platforms are not well suited to maintaining even 10 machine learning models when it comes to keeping track of data, navigating their user interfaces, or reporting capabilities, Matthew Nokleby, machine learning manager for Lily AI’s product intelligence team, told Protocol earlier this year. “The duct tape starts to show,” he said.

Kate Kaye

Kate Kaye is an award-winning multimedia reporter digging deep and telling print, digital and audio stories. She covers AI and data for Protocol. Her reporting on AI and tech ethics issues has been published in OneZero, Fast Company, MIT Technology Review, CityLab, Ad Age and Digiday and heard on NPR. Kate is the creator of RedTailMedia.org and is the author of "Campaign '08: A Turning Point for Digital Media," a book about how the 2008 presidential campaigns used digital media and data.
