
"Write once. Deploy anywhere": Intel's ode to developers
An interview with Bill Pearson, VP and general manager of Developer Enabling in the Internet of Things Group at Intel.

The pandemic and the world's gradual emergence from its varied enforced lockdowns have created a surge in demand for IoT applications. But for the developers who strive to bring solutions to market quickly, the challenge to overcome complex requirements, unique connectivity needs and disparate edge infrastructures can impede progress and create delays.
Intel's decades of experience working with developers have made clear the important role they play, how crucial it is to support them in unprecedented times and how technological advances (such as machine learning) create even more opportunities for "magic" to happen, according to Bill Pearson, VP and GM of developer enabling in the company's Internet of Things Group.
Protocol sat down with Pearson, who leads a team of engineers and business managers who are focused on ensuring developers receive the very best experience from Intel. We discussed the state of their developer community, developers' biggest pain points and how better access to AI tools is making their lives easier so they can focus on the creativity that makes for great solutions.
Why does Intel see developers as so important now and for the future?
Developers provide the magic, or the spark. I've been working at Intel on developer solutions for 24 years in some fashion, and when I think about the hardware that we provide, what our customers ultimately want to do with that hardware is turn it into solutions and solve problems, or essentially create new magic. By extension, developers are the ones who get the privilege of doing that. So, it's often their creativity, their innovation, their creation that results in these great solutions in the marketplace. And it's my privilege, along with a team of engineers and business managers, to create the tools that help enable that magic.
What is the current state of the developer landscape?
The cloud continues to present an exciting opportunity for developers, particularly those who are focused on getting into cloud-native technology, containers and orchestration. But there's also a rich history of embedded developers who have been working on IoT or real-world scenarios, such as moving a robot or adjusting a weld, for some time. Now we see those two worlds of IoT intersecting at the edge, and that intersection creates a neat opportunity to do a few things.
First, we can get a rich set of real-time or near-real-time data that shows what's happening in, say, a factory line, a retail store or a hospital. Second, we can apply artificial intelligence to figure out how companies should use that data. And third, we can process that data and apply AI in a way that leverages all the cloud innovation we've seen in the past few years.
What are the biggest trends you have been seeing over the last year with regard to the developer community?
There is a series of trends coming together. While AI as a concept has been around for a while, we've seen it really take off recently because we've had the compute power to process the data and apply the AI. Second, large data sets really help make AI possible, and I think we're going to see AI become more accessible in the future. If you look back in time, developers were a rare breed. Over time, development has become more accessible, which has resulted in more developers, and that's a great positive feedback loop that I think is going to continue.
The introduction of 5G into all of this is another major positive trend. Adding 5G to the edge and AI creates an opportunity to deal with the data where it sits. Now we can work with that data right there at the factory, the retail store or the hospital, all while knowing that we've got the latency, bandwidth and reliable connection that maybe we didn't have before.
"We cultivate relationships and help design products, tools, code samples -- all kinds of resources -- to make developers' lives easier."
Why do you think the edge represents a particular opportunity for developers?
The opportunity is really dealing with the data where it sits. Imagine a scenario where I collect all this data from the sensors in my factory. You could try to send all that to the cloud, and that's where your compute and development might happen. That has huge implications. Maybe I don't have persistent connectivity. The bandwidth costs of that are enormous. Imagine sending all these video streams from hundreds and hundreds of cameras from an installation to the cloud. Or there are security/privacy concerns and I want to keep that data on site.
So, there are lots of reasons why bringing all that data somewhere else doesn't make sense. But doing it at the edge where the data is today, if I add compute there, now I can analyze all that data and then I can take action without having to send the data anywhere. Now, we bring the compute to the data.
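To put a rough number on the bandwidth argument, here is a back-of-envelope calculation. The camera count and per-stream bitrate are illustrative assumptions, not figures from the interview, but they show how quickly "send it all to the cloud" adds up:

```python
# Back-of-envelope cost of shipping every camera stream to the cloud.
# All numbers below are illustrative assumptions, not Intel figures.
cameras = 300                # hypothetical installation size
mbps_per_stream = 4          # roughly a 1080p H.264 stream
hours_per_day = 24

total_mbps = cameras * mbps_per_stream
gb_per_day = total_mbps / 8 * 3600 * hours_per_day / 1000  # Mbit/s -> GB/day

print(f"Sustained uplink needed: {total_mbps} Mbit/s")
print(f"Data shipped per day:    {gb_per_day:,.0f} GB")
# Roughly 1,200 Mbit/s sustained and about 13,000 GB per day: a strong case
# for analyzing the video at the edge and sending only results upstream.
```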
We've been working with Audi on analyzing weld data. When they build a car, they do about 5,000 welds per car, all run through their line. The old way was for someone to go in and inspect one car per day out of the whole line and ask, "Well, how'd we do? Are the welds good or not?" That one car would then be used as a representative sample for the other cars processed that day.
Now with AI, they've been able to do a real-time inspection of each of those welds — they're inspecting every weld. Now, they're able to know for a fact they've got great quality out of every car, every weld, every time.
Or, when they see that they don't have the right quality, they can also take action on that data in near-real time. They can adjust the mix of chemicals in the welding machine or make other real-time adjustments on the line as they're building that car. So, the opportunity for developers at the edge is to have much more real-time interaction with all of that data as it's happening.
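To make that loop concrete, here is a minimal sketch of the pattern Pearson describes: score every weld as the readings stream in and flag the ones that need a line adjustment. The reading fields, the toy scoring function and the threshold are hypothetical placeholders, not Audi's actual model or data.

```python
# Minimal sketch of a closed inspection loop at the edge: score every weld
# as it happens and act on the line in near-real time. The model, threshold
# and "adjust the line" step are hypothetical stand-ins, not Audi's system.
from dataclasses import dataclass
from typing import Iterable, Iterator, Tuple

@dataclass
class WeldReading:
    weld_id: int
    current_a: float    # welding current (A)
    voltage_v: float    # welding voltage (V)
    duration_ms: float  # weld time (ms)

def quality_score(reading: WeldReading) -> float:
    """Stand-in for a trained model: returns 0.0 (bad) to 1.0 (good)."""
    energy = reading.current_a * reading.voltage_v * reading.duration_ms / 1000.0
    return min(1.0, energy / 2000.0)  # toy heuristic, not a real model

def inspect_stream(readings: Iterable[WeldReading],
                   threshold: float = 0.8) -> Iterator[Tuple[int, float]]:
    for reading in readings:
        score = quality_score(reading)
        if score < threshold:
            # In a real line, this is where you would call the welding
            # controller to adjust parameters or flag the part for rework.
            yield reading.weld_id, score

if __name__ == "__main__":
    demo = [WeldReading(i, 6000 + 500 * i, 1.1, 180 + 5 * i) for i in range(5)]
    for weld_id, score in inspect_stream(demo):
        print(f"weld {weld_id}: score {score:.2f} below threshold, adjusting line")
```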
Let's shift gears to retail. If, for example, I've got a trained model looking at foot traffic patterns and I'm running that model in my retail store, then I'm learning as people come in, updating and changing the model at the edge without having to go anywhere else.
There are so many examples like this where developers have a new frontier, if you will, to work at the edge.
How does Intel work with the developer community? What are some of the biggest pain points you have heard from developers?
Intel has been working with developers for decades. They are just so key to all of the work we do, regardless of the technology area, that we cultivate these relationships and help design products, tools, code samples — all kinds of resources — to make the developers' lives easier.
Recently, we've been focused on a product called OpenVINO, a toolkit focused on edge inference and on making that as simple as possible.
We started with the notion of: How do we make sure that developers can write once and deploy anywhere? In the old way of doing things, the developer might have had to create custom code for each one of those pieces of silicon. We don't want them to do that. So, we worked hard to create OpenVINO, which lets developers write one time, express their intent and then deploy that code across a variety of silicon from Intel.
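As a rough illustration of that idea, the sketch below uses the OpenVINO Runtime Python API to compile the same model for different device targets by changing a single string. The model path is a placeholder and the exact API surface varies between OpenVINO releases, so treat this as a sketch of the pattern rather than a drop-in example.

```python
# Sketch of OpenVINO's "write once, deploy anywhere" pattern: the application
# code stays the same and only the device string changes. The model path is a
# placeholder and API details vary by OpenVINO release.
import numpy as np
from openvino.runtime import Core

core = Core()
model = core.read_model("model.xml")  # IR file produced by the model converter

# The same model can be compiled for different Intel silicon by changing one
# string: "CPU", "GPU", or "AUTO" to let the runtime pick a device.
for device in ["CPU", "AUTO"]:
    compiled = core.compile_model(model, device_name=device)
    request = compiled.create_infer_request()

    # Dummy input with the shape the compiled model expects (static shapes assumed).
    dummy = np.zeros(list(compiled.input(0).shape), dtype=np.float32)
    results = request.infer({0: dummy})
    print(device, "->", next(iter(results.values())).shape)
```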
And the product has just been getting better and better. OpenVINO's original focus was computer vision, because that's where a lot of edge AI happens today. But we know developers are expanding to different use cases, so we've been expanding the portfolio to cover things like audio, and we've introduced tools like the Deep Learning Workbench, which helps developers dive into performance and gives them another way to figure out what's happening at the edge.
Another thing that we've taken into consideration is silicon selection. We heard from developers: "I know OpenVINO can solve all my edge inference applications, this is great. But what's the right silicon choice for me?"
The answer isn't all that simple. You can have some general rules of thumb, but ultimately you need to do some investigation. So, we've deployed tools like OpenVINO and the Deep Learning Workbench on our Intel DevCloud for the Edge.
What that means is that rather than having to buy a bunch of silicon and try it out, we put all that out there in the cloud and we've connected it to OpenVINO. Very quickly a developer can come in with their applications, their model, and run it against any of this edge hardware and find out how it's going to perform and which hardware's going to serve them best given their constraints.
This has been a game changer. It's really enabled developers to much more quickly understand the implications and adjust accordingly.
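A stripped-down version of that "try it on each target" step might look like the sketch below. It only illustrates the concept of comparing the same model across OpenVINO device targets; it is not DevCloud's actual tooling, and the device list and model path are assumptions.

```python
# Illustrative comparison of one model across OpenVINO device targets, in the
# spirit of what DevCloud for the Edge automates. Not DevCloud's actual
# tooling; the device list and model path are assumptions.
import time
import numpy as np
from openvino.runtime import Core

def average_latency_ms(model_path: str, device: str, runs: int = 50) -> float:
    core = Core()
    compiled = core.compile_model(core.read_model(model_path), device_name=device)
    request = compiled.create_infer_request()
    dummy = np.zeros(list(compiled.input(0).shape), dtype=np.float32)

    request.infer({0: dummy})  # warm-up run
    start = time.perf_counter()
    for _ in range(runs):
        request.infer({0: dummy})
    return (time.perf_counter() - start) / runs * 1000

if __name__ == "__main__":
    for device in ["CPU", "GPU"]:  # whichever accelerators are present
        try:
            print(f"{device}: {average_latency_ms('model.xml', device):.2f} ms")
        except RuntimeError as err:
            print(f"{device}: not available ({err})")
```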
The other pain point that comes up today is the complexity of AI. So, we're exploring how to get more pretrained models to developers so they can go further with them. This is all part of Intel's commitment to the democratization of AI. We're working to make AI technology available to more people so it can be more easily adopted.
You've been doing this for a long time. What lessons have you learned that help developers focused on AI and edge solutions?
We've been helping them with easy on-ramps to the technology; reference implementations are one example. Every developer wants a code sample, and we've been providing those forever, but they also want something they can apply.
We found that it works well to put the code samples together in a practical way that solves some problem: let's say we're going to build a traffic management solution. We take technologies like networking and edge inference, and we bring them together for the first time in a controlled way. By doing that, we've essentially done it for you, and we're showing you how to do it next time. We put those tools in the hands of developers and make it easy for them to deploy them on their own and try them out in our environment. With that, they're able to quickly and easily get started on that first application.
Many use cases and reference implementations are already available. The Intel® Edge Software Hub provides use cases and reference implementations via pre-validated software packages so developers can quickly experiment, test and create.
What are some of the most exciting and innovative use cases you've worked on recently?
One that comes to mind is Hitachi, which was looking at applying AI to health care imaging: CT scans, X-rays or MRIs, for example. One of the things they had to work through was: "How do I figure out what the right building materials and ingredients are for me?" Again, we're trying to balance performance, power and cost at the edge, and with DevCloud they're able to go in and make tweaks in real time to come up with a great solution.
Another one is Geekplus, which creates logistics and inventory control robots; think about someplace like a warehouse where they're always reconfiguring shelves. They wanted to determine which silicon architecture was the most effective, and testing equipment samples in their physical lab was going to take significant time and budget. Instead, Geekplus was able to test various combinations in DevCloud to figure out what hardware to build on, and they ended up choosing a combination of CPUs, Core i3s and Movidius VPUs for their application.
The interesting thing for me is that not only did they save development time in this case, doing it in a matter of only a few weeks, but they also saved cost. A lot of times the time-saving is more valuable, but it's always great when you can get both.
Looking forward, what are your predictions for the next year?
I think we're going to see both a continuation and acceleration of the trends we talked about today. I recently read something that said we're just getting started with the cloud. The edge is even more nascent than the cloud, so over the next year I think you're going to see more and more developers building applications at the edge, specifically designed to run where the data is.
We're also going to see more intersections of technology trends. We talked about AI and 5G — now imagine the convergence of those. That's super powerful, and we're just at the beginning of that. And that means we're going to see more applications that you and I didn't even think of in the past.
And the last trend is the democratization of AI. I think you're going to see AI become more and more simple. There are companies today that are saying, "How can I make AI a low-code/no-code or drag-and-drop type environment?"
This will have the effect of expanding the number of people that can play in AI, and I think that's going to drive more implementation and innovation at the edge as it happens.
When we talk about the future, people always want to know how far away we are from being autonomous.
When you look at technologies like AI, 5G and real-time control, all of those are laying the foundation for "autonomous." People like to talk about autonomous driving, but we have that really interesting example of robots inspecting welds. When I've got 5G there and a managed private network configured for the factory floor, I can reconfigure the whole factory in real time based on what's coming down that line. That's a semi-autonomous robot taking in real-time data, processing it and making decisions.
So, the convergence of real-time control with AI and 5G enables more autonomy, and over time I think we will definitely see more of that.
What impact did the pandemic have on these trends?
The pandemic showed us the way technology can add value to people's lives. In fact, in April 2020, Intel launched the Pandemic Response Technology Initiative. The initiative sought to provide a 360-degree view of the challenges ahead, focusing on how our technologies can enhance health care, education and the economic recovery of businesses at multiple levels. Our goals were to provide immediate relief where it was needed most, develop innovative solutions to support the new normal and invest in technology that would limit the impact of future crises. Nearly every piece of Intel technology was leveraged in some way.
Twelve months later, the scope of PRTI work includes 230 projects spread across 170 organizations. Intel is now continuing this work with the Intel RISE Technology Initiative, which will continue to review and fund projects related to health care, education and the economy, with new dedicated work streams for social equity and human rights, accessibility and climate action.
Looking at health care specifically, we have countless examples of companies that are transforming the medical space. In the past year, health care innovators applied modern technology to change the way care is delivered, data is tracked and vaccines are created and deployed. That need isn't going away. During the last year it was amplified, and not just in health care: all industries thought differently about their business and how to use technology in new and interesting ways.
And now we're seeing how developers, companies and society are going to continue that trend, making the world safer and more productive and enriching our collective existence as we continue this exciting adventure we call life.