The transformative devices of tomorrow are within Arm's reach.
Arm announced its latest slate of chip designs (the Cortex-A78 CPU, Mali-G78 GPU and Ethos-N78 NPU, for those keeping track) on Tuesday for products that will likely start hitting shelves in 2021, assuming we're allowed outside by then. These designs, which SoftBank-owned Arm licenses to companies like Qualcomm, Samsung, Nvidia and Apple, power the processors and graphics chips found in mobile devices around the world, and increasingly in new product categories like VR headsets, 2-in-1 laptops and smart-home devices. All of them are faster and more energy-efficient than the designs they replace.
Arm's second-generation Ethos neural processing unit is a mobile processor dedicated to machine-learning tasks that the company says is 25% more performant than last year's model. This might not sound like much, but Paul Williamson, Arm's VP and general manager of client devices, tells Protocol that it could provide the intelligence to efficiently power sensors in cutting-edge devices that help seamlessly blend the real and virtual worlds, and help out with battery life, too. "It's a very critical workload for things like VR and AR where you're trying to detect objects in the world around you," Williamson said. And given that we're all stuck at home, an immersive world to escape into sounds like a welcome distraction.
Protocol recently spoke with Williamson about what these new chip designs could mean for the future of mobile devices, from new VR headsets to tomorrow's smart TVs, and about designing that future during a pandemic.
This interview has been lightly edited for length and clarity.
Do you feel that these designs, especially this new NPU, have the power that existing VR and AR systems may be lacking? Are we getting to the point where devices are thin and efficient enough that people would actually use them for long stretches?
I think the shape of compute, to make it truly wearable, is going to require us to tether the headset for some time yet. I think you'll see a lighter-weight front end that's very much focused on driving the display and the sensors, but probably with a tethered compute engine supporting that. I think this platform of IP is really well structured to support a lot of innovation in that tethered structure, whether that's just by plugging into the smartphone or a dedicated puck [like Magic Leap has] that goes with a discrete product.
Oculus Quest has shown that people are getting more excited about VR. An untethered experience does offer some freedom, and this brings more power to those kinds of platforms, but it's a balance. I think for augmented applications, you'll still want an all-day wearable to be lighter than something like a Quest. And while this [NPU] drives efficiency and performance together — which means you can imagine that a Quest or a full VR headset gets lighter in terms of the needed battery for the same performance or it can drive more performance for the same weight — a physical display and a battery have some real weight to them. And those things aren't changing overnight. So, we're probably going to see a mixture of form factors.
Smart TVs are starting to feel ubiquitous. Are there other devices in the home that are going to be made smart at the same scale as our TVs?
I think it'll be interesting to see how we come out of the current pandemic in terms of our relationship with technology in the home: whether the smart assistant becomes a richer, more visual experience rather than just a voice interface to the cloud; whether we expect more videoconferencing in our lives more generally. There are definitely some interesting trends that this technology platform would certainly support if culturally we become more interested in having that capability. I think that's true of the workplace as well: We're going to become more interested in virtual presence in meetings and have more expectations around visual content there.
In [smart TVs], though, I think there's quite a long way to go. We're seeing interest in bringing in more gaming capability, and in pushing resolution even further, to the 8K TVs announced at CES. Both the richness of the applications on digital TV and the performance demands of 8K-type user interfaces and content mean you absolutely need a stronger compute capability underneath.
It feels like we're getting to a point where my ostensibly mobile devices can do most of the tasks my desktop computer can do. How does Arm think about the convergence of these two markets that have been traditionally separate?
Actually, I'm taking this call with you on the Microsoft Surface Pro X, which is an Arm-based Windows device, and it does cross that boundary: it's something I can pick up and use like a tablet, but also dock with my large screen and use like an enterprise device. It looks very at home running the full Office suite. And that's all powered by the Arm IP that we're describing here, which allows you to build applications targeted for Arm, whether it's a smaller screen, a larger screen or a really big screen that you're plugged into.
So definitely we see that happening in real-world devices today, the Surface Pro X being a great example. And I think it's a trend that supports developers as well. If I want to address as many people as possible with something — like videoconferencing — I want to be able to do that across all form factors so they can do it from their garden, their living room or their office without having to think about these things. At Arm, we're definitely trying to make sure that we serve the developer, and multiple form factors, rather than thinking of this as a smartphone problem.
What do you make of the reports that Apple is looking at putting Arm chips in its laptops? Is that a further sign of that convergence?
We're not at liberty to speculate about what they might be doing, but I think the fusion between tablet, laptop and smartphone is clear, and it has benefits for developers and for users. On the general theme, it's a positive step.
Are there things you're already seeing in the way that people are using existing products during the pandemic that are influencing the way that you think about tomorrow's products?
Our goal is to provide the compute platform that everybody needs for the future. So we've looked at this quite a bit and said, "What's changed?" I think what we've noted is that perhaps it's accelerated themes that we were already working on and fusing into the platform. We always have to work on quite a long timeframe: the IP that we're launching today was developed some time ago, and it will appear in silicon, in the hands of consumers, sometime next year. So we're typically looking at a three- to five-year timeline with our roadmap anyway. And I think one of the things it accelerates is the need for performance across multiple different workloads. It's going to be a confluence of how all those bits perform together. That is something we see reinforced by some of the new ways people are using devices: wanting to run videoconferencing with multiple views open, at the same time as wanting to scroll through applications or have other things going on in their lives around them.
The other thing that I think this really validates is our focus on security. If we're going to use these devices increasingly for things like medical services and personal data, making sure we've built a platform that can be as secure as possible for the developer and for the user is really key. So it's reinforced some of the investments we've been making for a while, and we expect demand for them to keep increasing in the future.