Intel is flooding the skies with drones, doing everything from sending them out to inspect massive solar arrays in the Mojave Desert to lighting the night sky above Disney World with 500 drones.
It’s all part of a virtuous cycle, according to Anil Nanduri, vice president in Intel’s New Technology Group and general manager of its unmanned aerial vehicle (UAV) segment. The world’s biggest chip maker has moved beyond chips to focus on great outdoor experiences for users.
And drones are a good market because they use a lot of technology, including Intel’s RealSense depth cameras, and they produce an enormous amount of data. That data keeps the servers in data centers humming, and that creates more demand for Intel’s processors.
Nanduri is in charge of keeping that virtuous cycle going, and I talked with him about that, and more.
Here’s an edited transcript of our conversation.
VentureBeat: How did you get into the drone field?
Anil Nanduri: When we started with RealSense, we were working on PCs and tablets. We were trying to see how we could change the computing interface with gestures and other use cases. As we were exploring, we had two versions of RealSense cameras: the front-facing camera, which was primarily for PC interaction, and the world-facing camera, which we designed for tablets. The world-facing, depth-sensing camera could see farther away and also had the ability to work outdoors.
Then we said, “Wait a minute. This depth-sensing camera could be used in other fields besides PCs and tablets. How do we apply that?” The use cases around scanning for printing, or robotics for collision avoidance, or drones for collision avoidance — that’s what the ideation triggered. Where could we apply this beyond PCs?
It was part of that exercise, looking at RealSense capabilities beyond PCs and tablets, that got me into drones. How do we apply this in a flying platform? If you recall, at CES 2015 we had two demos on stage. One was a media conferencing robot navigating using RealSense. The other was a drone with six RealSense cameras.
VB: To step back a bit, what was the thinking around going into drones at all? Intel is historically an ingredient-maker, a chip maker. This is making the whole dish. You guys don’t make that move in every case where your chips are used.
Nanduri: We’re part of Intel’s New Technology Group. This was formed to scope out new opportunities and new markets and different ways we could operate within that domain. We’re not held to any kind of engagement model where we only have to do ingredients. We have that ability to innovate.
When it came to drones, we started with the RealSense technology. We worked with Yuneec in the consumer industry and got RealSense collision avoidance in the Typhoon H drone, which is more of a pro-sumer standpoint. We made some acquisitions as well — Ascending Technologies, and later MAVinci. What we realized is that we own some amazing technology now, including full commercial-grade systems. These had very high redundancies built into them, very accurate, high-precision systems. For commercial applications, like construction inspection, where you need high accuracy reconstruction of your point of interest, these systems were deployed and widely used — not in the U.S., but in Europe and other markets. For enterprise use cases and commercial use cases, the capabilities that are needed come to be much more rigid.
When it came to drones, we started with the RealSense technology. We worked with Yuneec in the consumer industry and got RealSense collision avoidance into the Typhoon H drone, which is more of a prosumer product. We made some acquisitions as well, Ascending Technologies and later MAVinci. What we realized is that we now owned some amazing technology, including full commercial-grade systems with very high redundancy built in, very accurate, high-precision systems. For commercial applications like construction inspection, where you need high-accuracy reconstruction of your point of interest, these systems were already deployed and widely used, not in the U.S., but in Europe and other markets. For enterprise and commercial use cases, the required capabilities are much more stringent.
We said, “Hey, we have a good value proposition here. There’s great demand in this space. Why should we be encumbered by history?” This was a completely green field.
VB: If you have a mastery of the technology already, it makes sense.
Nanduri: Right. The other part of it, even though Intel’s model has been traditionally making ingredients — you know very well that we’ve done a lot of system development, even up to PCs. We develop a lot of capabilities. We just don’t productize them. We have the knowledge to build things end to end, so we took the extra step to brand it and sell it as well. That know-how, building end-to-end systems, is very instrumental in pushing the technology and innovation forward.
VB: Brian has mentioned this before, that these are not just things that use Intel’s chips. They’re also products that generate an enormous amount of data.
Nanduri: Correct. That’s where it connects to our virtuous cycle. For each drone flight, depending on the payload you’re using, the rough math is this: each frame you capture with a high-resolution sensor or camera is about 25 megabytes. If you take 200 images, that’s five gigabytes. If you take 2,000 images, it’s 50 gigabytes. That’s one flight. These are the standard file sizes you deal with in this space. Then you have the processing behind it, applying tools and software that need a lot more compute.
It clearly fits into our virtuous cycle, along with the next set of machines that are going to generate huge amounts of data: autonomous cars, robotics, virtual reality. Drones belong in that domain of huge data sets, beyond what consumers can typically generate.
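To put those figures in context, here is a minimal back-of-envelope sketch. The 25 MB-per-frame number comes from Nanduri’s comments above; the flight parameters and the function name are illustrative assumptions, not Intel’s figures.

```python
# Rough estimate of raw data generated per drone flight, using the
# ~25 MB-per-frame figure cited in the interview. Image counts are
# illustrative assumptions only.
MB_PER_FRAME = 25

def flight_data_gb(num_images: int, mb_per_frame: float = MB_PER_FRAME) -> float:
    """Return the approximate raw capture size of one flight in gigabytes."""
    return num_images * mb_per_frame / 1000  # treating 1 GB as ~1000 MB

for images in (200, 2_000):
    print(f"{images:>5} images -> ~{flight_data_gb(images):.0f} GB per flight")
# 200 images -> ~5 GB; 2,000 images -> ~50 GB, matching the numbers above.
```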
VB: Having data itself is considered a good thing or a priority these days, as far as making good use of all the servers out there in the data centers.
Nanduri: Right. But the workflows are different, how you apply them and deploy them. It’s good to have an end-to-end understanding of the workflow. These are new use cases. We’re just seeing the tip of the iceberg right now in how they can be applied. Talking about bridge inspections — there are more than 600,000 bridges out there in the United States. Think about manpower and individual safety. It’s dangerous to put all those folks out there on cherry pickers or harnesses. These machines can do the work more quickly and reliably. They can automate it in easier ways. The value prop becomes a no-brainer at that point.
If you look at how inspections are done today, people literally climb towers on harnesses, sitting there doing visual inspection. You don’t need to do that anymore. You may still need to send people in to fix things, but you can reduce the number of times a person has to go up there, and you can inspect more often. The later you find a problem, the more expensive it is to fix. You can build a database of those inspection reports, update it more often, and then use compute to analyze it and catch issues earlier. Today you can’t do it as often because it’s expensive and unsafe. With drones you can do it more often, and at a fraction of the cost.
The question, then, is how do we get there? What needs to happen for that to become a widespread, day-to-day use of drones?