Radar, ultrasound, AR: Why weird tech matters
Good morning, and welcome to Protocol Next Up. This week's edition is all about quirky and cutting-edge tech for smart devices and phones, and what it means for developers and entertainment companies. We're talking radar, ultrasound, displays that follow your voice, gesture control, AR, 5G and something called multi-access edge computing. Strap in, it's gonna get weird.
(Was this email forwarded to you? Sign up here to get Next Up every week.)
The Big Story
What do new smart display sensors and features mean for developers?
For the past few days, my house has felt a bit like a lab. Google's latest Nest Hub has been shooting radar beams at me at night, measuring my sleep quality and breathing rhythm. And after getting up, I've had conversations with Amazon's Echo Show 10, which constantly rotates its display in my direction when I speak, and does a little happy dance whenever I tell Alexa I love her.
I've been testing these devices for a few stories, including a deep dive on Amazon's decision to add motion to its latest smart display. But I've also been curious how these new sensors and features change the game for third-party developers, including media companies looking for the next big thing.
- Amazon's Echo Show 10 features a display that can rotate 360 degrees around its speaker base to keep its screen facing you at all times, and the company has choreographed a handful of motions that developers can add to their skills (a rough sketch of what that could look like follows this list). Some of the media companies taking advantage of motion for their own skills include Comedy Central, Sony and Universal Games.
- Google's new Nest Hub makes use of the company's Soli radar sensor. In addition to monitoring your body during nighttime hours to analyze your sleep, Soli is also being used for some rudimentary gesture control, allowing you to snooze an alarm with a wave of your hand. "Soli can measure movement on the micro scale and the macro scale," Google Nest senior product manager Ashton Udall told me last month. Google may add more Soli-based features over time, Udall suggested. "There are a lot of ideas on the whiteboard."
- In addition to Soli, Google has also been using ultrasonic audio to detect presence and proximity on its Nest Hub and Nest Mini devices for some time. Walk up to a Nest Hub running a timer, and the visual style changes to account for the decreased distance.
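To make the motion idea a bit more concrete, here's a minimal sketch of what a skill response with a motion cue might look like. To be clear, this is illustrative only: the directive type and choreography name below are placeholders I made up, not Amazon's documented Smart Motion interface, though the outer envelope follows the standard Alexa skill response format.

```python
# Illustrative sketch only: the "SmartMotion.PlayChoreo" directive type and the
# choreography name are hypothetical placeholders, not Amazon's documented API.
# The surrounding envelope follows the standard Alexa skill response format.
import json


def build_motion_response(speech_text: str, choreo_name: str) -> dict:
    """Return an Alexa-style skill response that speaks and, hypothetically, moves."""
    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {"type": "PlainText", "text": speech_text},
            "directives": [
                {
                    # Hypothetical motion directive, for illustration.
                    "type": "SmartMotion.PlayChoreo",
                    "choreoName": choreo_name,
                }
            ],
            "shouldEndSession": False,
        },
    }


if __name__ == "__main__":
    print(json.dumps(build_motion_response("Aw, I love you too!", "happy_dance"), indent=2))
```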
One of the big differences between Google's and Amazon's approach: Google has not opened up these technologies to third-party developers yet, and my sense is that this won't change anytime soon. Amazon, on the other hand, has been a lot quicker in making advanced features available to developers, including motion for the Echo Show 10. "Developers are our partners in the making of these experiences and their widespread adoption," Amazon's Alexa devices and developer technologies VP Nedim Fresko told me.
Developers clearly like that approach. "We always want to lean into what the future is, and what the next exciting thing is," I was told by Max Child, the CEO of voice game startup Volley. "I'm glad people are adding more features and more sensors to these kinds of devices."
Child also suggested that device makers should give third-party developers more direct access to the cameras present in many smart displays, with the right privacy safeguards in place. The furthest along in this area is Facebook. After announcing plans to let AR creators target new devices at its Connect conference last year, the company is now doing alpha tests of third-party AR effects on its Portal devices, with a closed beta planned for the near future.
Screens, radar, ultrasound, motion, gestures, AR camera effects: What unites all of these technologies is the idea that smart devices become more useful when they offer ways to interact that go beyond voice-only commands.
Early data seems to support this, with Amazon telling me that so-called multimodal skills see more than three times as many monthly users as voice-only skills on multimodal devices (meaning smart displays). Skills that have added video on these devices have, on average, seen around double the engagement of voice-only skills.
Yes, but: There are some reasons to be slow and deliberate about allowing developers to add these kinds of new features. Privacy is an obvious one. Already, there are fears that Google's Soli sensor could track people's sex lives (the company says that is not the case).
Fragmentation could also become an issue if different devices end up with wildly different sensors and capabilities. Fresko told me that Amazon treats things like motion, and even displays in general, as optional, with voice being the foundation that works across all smart devices. "We definitely consider this, and kind of architect things such that the layers are easy for developers to understand," he said.
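That layered approach is easy to picture in code. Here's a rough sketch of it, with voice as the baseline and a visual directive attached only when the incoming request says the device actually has a screen. The supportedInterfaces check mirrors the standard Alexa request envelope, while render_visuals() is a hypothetical helper standing in for a full APL document.

```python
# Sketch of "voice first, everything else optional": always return speech, and
# only attach a visual directive when the request envelope says the device
# supports APL (i.e. it has a screen). render_visuals() is a hypothetical
# stand-in for building a real APL document.


def render_visuals() -> dict:
    # Hypothetical placeholder for an Alexa.Presentation.APL.RenderDocument
    # directive; a real skill would supply a complete APL document here.
    return {"type": "Alexa.Presentation.APL.RenderDocument", "token": "demo", "document": {}}


def handle_request(event: dict) -> dict:
    device = event.get("context", {}).get("System", {}).get("device", {})
    interfaces = device.get("supportedInterfaces", {})

    response = {
        "version": "1.0",
        "response": {
            "outputSpeech": {"type": "PlainText", "text": "Here's your update."},
            "directives": [],
            "shouldEndSession": True,
        },
    }

    # The display is treated as an optional layer on top of the voice response.
    if "Alexa.Presentation.APL" in interfaces:
        response["response"]["directives"].append(render_visuals())

    return response
```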
Third-party developers may also have only limited resources, and making a display wobble might not be at the top of their list. "I have to make sure whatever we are working on is going to pay off in the near term," Child told me. Still, he was excited to experiment further. "It's a new frontier and greenfield opportunity to try out new types of experiences, new types of games, new types of entertainment," he said.
Overheard
"There is a tendency to be like, 'Oh, these are the three people who made VR happen,' or whatever. But the cool thing about consumer electronics is it takes thousands of people." —Facebook head of VR hardware Caitlin Kalinowski, reflecting on the history of VR as part of a great oral history of the five years since Facebook launched the original Oculus Rift headset to consumers.
"People should realize that we've come a long way and we've done a great job — but this road stretches out for the rest of their lifetimes, and that's the most exciting part!" —Facebook Reality Labs Chief Scientist Michael Abrash, as part of that same oral history.
JOIN US

Technology has been the leading sector in trust since Edelman began its Trust Barometer 21 years ago. Since that time, trust in business has risen while trust in technology has declined – and this year, the decline has been dramatic. Join Edelman for a discussion with tech industry leaders on what's next for Tech & Trust in 2021. This event is moderated by Protocol.
Watch Out
Niantic's latest is a showcase for 5G AR
The maker of Pokémon Go unveiled a new multiplayer demo called Codename: Urban Legends this week that's all about using 5G for the future of AR. The game lets multiple players hunt and battle a variety of monsters together on their 5G phones, and will soon be available to demo in the stores of select mobile carrier partners.
Niantic has shown off social mobile AR in the past. In 2018, the company invited a bunch of journalists to its offices in San Francisco, where we got to play a weird and fun AR laser tag demo. However, that demo ran on Wi-Fi, using a custom-built P2P networking technology, and may not have worked quite as well via plain-old 4G cellular connectivity.
5G changes that. The new mobile networking tech has long been touted as faster, but whenever I talk to developers, they instead point to two other key features: lower latency and the ability to serve many more users at the same time.
Niantic repeated those points in a blog post this week, claiming that 5G helped the company to reduce the latency of Codename: Urban Legends by a factor of 10, while allowing it to connect up to 10 times as many players. That's enabled by a technology called multi-access edge computing, or MEC, as Niantic pointed out in this video.
Niantic is primarily touting this technology, and its partnerships with carriers around the globe, as a way to enable the future of AR gaming. However, the company has also been busy working on things that could go beyond gaming, with COO Megan Quinn telling me last year that we should expect non-gaming applications from both Niantic itself and third-party developers using the company's platform.
"We expect that the experiences built on the Niantic Real World platform will be well outside of gaming, and we're excited to see what developers come up with," Quinn said at the time.
Coincidentally, Niantic CEO John Hanke this week also offered a first glimpse at what appeared to be the AR headset reference design the company is working on together with Qualcomm. The partial image of a device Hanke tweeted immediately led to all kinds of speculation about future Niantic-branded consumer AR hardware, but Quinn had already shot down that idea: "We are not building our own hardware, nor do we have plans to," she told me last year.
Fast Forward
- On Protocol: T-Mobile is already shutting down its live TV service. TVision will go away at the end of April, half a year after its launch.
- Quest 2 has sold better than all other Oculus headsets combined. That includes Rift, Rift S and Go. Now if only Facebook gave us some real numbers …
- Why it's hard to make sense of media company valuations. ViacomCBS' stock lost half of its value over the past two weeks. What's up with that?
- Sports streamer DAZN pays big for Italian soccer games. DAZN, which is led by ESPN and Disney alumni John Skipper and Kevin Mayer, is shelling out almost $1 billion per season for the Serie A league.
- Snap may announce new AR glasses in May. The next version of Spectacles may come with actual AR overlays, but it might be sold only to developers.
- Apple leads investment into a music distribution company. UnitedMasters raised $50 million in new funding for its services targeting independent musicians.
- U.S. Army orders 120,000 custom AR headsets from Microsoft. The contract for HoloLens-based headsets is worth close to $22 billion over 10 years.
- Technicolor has deployed over 10 million Android TV set-top boxes. One of Google's not-so-secret weapons in the smart TV platform space has long been its operator business.
Auf Wiedersehen
I've got one word for you, dear returning reader: tomorrow. If that sentence made you feel like you're missing some context, then you probably didn't catch my daughter's riddle in last week's Next Up — and you may get a sense of how I felt after reading an article about the mafia fugitive who got arrested after posting cooking videos to YouTube. What did he cook? Were his videos any good? Did he have a loyal online following? Was he trying to monetize his skills, and become a famous YouTuber? How many other YouTube videos have been produced by bored criminals in hiding? Should the FBI have an influencer protection program? Has anyone ever been under TikTok house arrest? So many questions that we may never know the answers to. Still, from now on, I'm going to imagine that every YouTuber is secretly a criminal mastermind. I mean, seriously: What is Lofi Girl working on, if not a long-winded confession?
Thanks for reading — see you next week!