It’s no fun when you fire up a heavy-duty game on your tablet and it starts to get warm. Imagination Technologies, the chip design company that owns the MIPS processor architecture and the PowerVR graphics technology, wants to create a future where tablets are both capable and power efficient.
We should all hope that it succeeds, because we’re going to want our tablets to handle increasingly difficult workloads, like figuring out hazards on the road or taking data from sensors and making it meaningful.
We talked with Pete McGuinness, director of technology marketing at Imagination, at the recent 2015 International CES. He didn’t hold back his opinions on the right way and the wrong way to approach this critical computing problem.
Here’s an edited transcript of our talk.
VentureBeat: What do you have on your tablet?
Peter McGuinness: This is a piece of software written by a partner, a company called Luxoft. What it’s showing is a proximity warning and a lane departure warning. When you drift out of the lane or change lanes, it gives you this red warning. It gives you the distance to the vehicle ahead and things like that.
The framerate is the nice thing. We have a framerate counter here and a central processing unit (CPU) utilization counter here. Two things are going on. First of all, if you like, it’s a heterogeneous app. It’s running partially on the CPU, but the heavy lifting on the image processing is being done by the graphics processing unit (GPU). The underlying algorithms have been ported to OpenCL. It’s on Android, with an Intel Atom. It has a four-cluster Rogue 6-series GPU in it.
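(Editor’s note: the lane-detection code itself is Luxoft’s and isn’t public, but to give a flavor of the kind of work that gets ported to OpenCL, here is a minimal sketch of a Sobel edge-detection kernel, a common early stage in lane detection. The kernel and its names are our illustration, not Luxoft’s code.)

```c
/* Illustrative OpenCL C kernel: Sobel edge detection, the kind of
 * per-pixel, highly parallel stage a lane-detection pipeline might
 * run on the GPU. Hypothetical sketch, not Luxoft's actual code. */
__kernel void sobel_edges(__global const uchar *gray,  /* grayscale input */
                          __global uchar *edges,       /* edge-map output */
                          const int width,
                          const int height)
{
    int x = get_global_id(0);
    int y = get_global_id(1);
    if (x < 1 || y < 1 || x >= width - 1 || y >= height - 1)
        return;                       /* skip the image border */

    /* 3x3 Sobel gradients; each work-item handles exactly one pixel. */
    int gx = -  gray[(y-1)*width + (x-1)] +   gray[(y-1)*width + (x+1)]
             - 2*gray[y*width + (x-1)]    + 2*gray[y*width + (x+1)]
             -  gray[(y+1)*width + (x-1)] +   gray[(y+1)*width + (x+1)];
    int gy = -  gray[(y-1)*width + (x-1)] - 2*gray[(y-1)*width + x]
             -  gray[(y-1)*width + (x+1)] +   gray[(y+1)*width + (x-1)]
             + 2*gray[(y+1)*width + x]    +   gray[(y+1)*width + (x+1)];

    /* Cheap gradient magnitude, clamped to 8 bits. */
    int mag = abs(gx) + abs(gy);
    edges[y*width + x] = (uchar)min(mag, 255);
}
```

The host enqueues this over a width-by-height 2D range, so the GPU evaluates every pixel independently and in parallel, which is exactly the shape of work McGuinness says belongs on the GPU rather than the CPU.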
VB: And this tells us what?
McGuinness: On Android, if you’re doing imaging of this type, you’re passing video buffers between the various hardware components. You import it from the camera, decode a bitstream, and that creates a buffer with the image data in it. Then you have to take that buffer and copy every frame into an area of memory owned by the GPU, in this case. It could be a video encoder, for videoconferencing, or a display controller if you’re just going to put it on the screen. But the point is that with every movement between the different hardware blocks in the SoC [system on chip], Android tries to make a copy of the data. That turns out to dominate performance on apps like this.
What the imaging framework does is extend some of the buffer-management APIs in EGL, and we’ve added some utilities. It doesn’t completely eliminate buffer copies, but it minimizes them. Instead of having six or seven buffer copies, which is easily possible if you just use stock Android, it’ll go down to a single copy, which makes something like this possible on an embedded system on chip. You can get the framerate and you don’t overheat the device.
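(Editor’s note: Imagination’s imaging-framework API isn’t public, but the standard Khronos path for sharing a buffer across hardware blocks is to wrap the native Android buffer in an EGLImage, then hand that same allocation to OpenCL. The sketch below shows that path via the EGL_KHR_image_base and cl_khr_egl_image extensions; the function name, and the assumption that the framework layers on these exact extensions, are ours.)

```c
/* Sketch of the zero-copy mechanism: wrap an Android native buffer in
 * an EGLImage, then map that same allocation into OpenCL. No pixel
 * data is copied at either step. That Imagination's framework builds
 * on these extensions is our assumption; error handling omitted. */
#include <CL/cl.h>
#include <CL/cl_egl.h>     /* cl_khr_egl_image extension */
#include <EGL/egl.h>
#include <EGL/eglext.h>    /* EGL_KHR_image_base, Android buffer target */

cl_mem import_camera_frame(cl_context ctx, EGLDisplay dpy,
                           EGLClientBuffer native_buf) /* e.g. an ANativeWindowBuffer* */
{
    /* Wrap the existing allocation in an EGLImage handle. */
    PFNEGLCREATEIMAGEKHRPROC eglCreateImageKHR =
        (PFNEGLCREATEIMAGEKHRPROC)eglGetProcAddress("eglCreateImageKHR");
    EGLImageKHR img = eglCreateImageKHR(dpy, EGL_NO_CONTEXT,
                                        EGL_NATIVE_BUFFER_ANDROID, /* Android-specific target */
                                        native_buf, NULL);

    /* Hand the same memory to OpenCL: kernels read the camera frame
     * in place instead of working on a copy. */
    cl_int err;
    return clCreateFromEGLImageKHR(ctx, (CLeglDisplayKHR)dpy,
                                   (CLeglImageKHR)img,
                                   CL_MEM_READ_ONLY, NULL, &err);
}
```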
You’ve seen that Nvidia Tegras are big in autos, right? One of the reasons for that is that in a car, you can put a fan on top of the device. That’s the limitation in most cases: the power dissipation. In a form factor like this, without OpenCL and without the buffer management we call Zero Copy technology, this would just go into thermal shutdown after a few seconds. Instead, this will run indefinitely.
VB: Is that the Dell Venue 8, the brand-new one?
McGuinness: Exactly, that’s right. This is just one example. It’s a device that’s already launched. We have another tablet we can’t show publicly from another manufacturer, with the same chipset. What they did was, they used our Zero Copy and the imaging framework to port a lot of the computational photography and image processing tasks onto the GPU. Of course they want video filters and Instagram filters and things like that.
VB: Is this going to get hot as well?
McGuinness: Feel it. It’s barely using the CPU.
VB: I use an Nvidia Shield, which can run very hot.
McGuinness: Yeah. Not to get into Nvidia bashing. Or, why not? I mean, Tegra is not really a mobile chip. It runs too hot. It’s just not suitable.
There are a couple of messages attached to using GPU compute. The first is that for highly parallel tasks like this, it’s a power optimization method. You can get much better performance at much lower power. We’ve seen instances where we’ve multiplied the performance by a factor of six and divided the power by a factor of 10. That’s a 60x gain in performance per watt altogether, just by moving onto the GPU. When you’re streaming video, the architecture of the GPU is so much more appropriate for that sort of task.
This imaging framework was the thing we found necessary to make that feasible in Android, because of the way Android tries to manage buffers. That’s why we put it out there. The way we’re deploying it, we’re working with OEMs who want to differentiate their products. This is something layered on top of the Dell tablet. As for this other tablet manufacturer, I don’t know when they’re going to launch the thing. What they’ve done is taken our imaging framework, taken the standard camera app for Android, and extended it to differentiate their tablet from everyone else’s. They’ve added camera features like anti-shake and face detect by using that framework, and also added other fun effects like face beautification, or this app that will enlarge your eyes. Apparently in China they think that’s a wonderful thing to do.
This is where we see the technology going. Everyone’s talking about GPU compute, heterogeneous compute. HSA [heterogeneous system architecture] is coming along, all that sort of thing. It’s going to be in visual imaging applications. It’s usable in Project Tango, in Glass, and in lots of other image-based appliances out there.
We have a really nice demonstration at our booth based on a camera sensor. It’s mounted in the ceiling, mapping an area about the size of this room. When someone walks into it, it maps them, tracks them, and detects where they go. It’s a point-of-sale monitoring thing. They can see when people walk to the register, what they look at, whether they pick something up and buy it or just walk away without buying anything. The intelligence for that is shared between the camera and the point-of-sale terminal, but in any case it has to run on a consumer-level IoT device rather than a big PC. That’s where we see a lot of this technology going.
VB: Do you think there’s some sort of preferred platform yet for things like augmented reality glasses? What kind of graphics do you need to make that acceptable and cheap? Some of these glasses come from the military, so they’re really high-end, around $5,000.
McGuinness: And they don’t work very well. Oculus Rift is much cheaper and much better. The first-generation Glass was kind of disappointing. The screen is tiny. It’s limited in functionality and it really has no graphics, apart from just putting up messages in front of your eye. It’s definitely not augmented reality. It doesn’t overlay things on your view. You need a pair of glasses with a screen that covers your entire field of vision.
VB: I saw something from Osterhout Design Group recently. They’re one of the military designers, doing night-vision goggles and stuff. They just announced a consumer version coming out this year that’ll be under $1,000. It’s a regular pair of glasses with screens on the lenses that do overlays and other sorts of things while you’re still able to see everything.
McGuinness: We work really closely with the people at Oculus. John Carmack is a big fan of ours. We help them a lot with the graphics for the Rift. The major thing in making it usable — you’ve heard of the effect where, if you have full VR and there’s too much lag between your head movement and the display reacting, it makes you nauseous. Reducing that lag between the action detected by the sensor and the reaction on the screen is critical. John wanted to chase the raster, actually, to not even have a single buffer. That reduces it from milliseconds to microseconds of delay, which is what you need.
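(Editor’s note: to put numbers on “milliseconds to microseconds,” here is a back-of-the-envelope calculation, our illustration rather than an Oculus figure. With conventional buffering, a finished frame can wait up to a full refresh before scan-out begins; chasing the raster means writing scanlines just ahead of the beam, so the wait shrinks to a few line times.)

```c
/* Back-of-the-envelope latency comparison: buffered rendering vs.
 * "chasing the raster" (rendering a few scanlines ahead of scan-out).
 * The panel numbers are assumptions for illustration. */
#include <stdio.h>

int main(void)
{
    const double refresh_hz = 60.0;   /* assumed display refresh  */
    const int    lines      = 1080;   /* assumed panel line count */

    double frame_ms = 1000.0 / refresh_hz;       /* ~16.7 ms per frame */
    double line_us  = frame_ms * 1000.0 / lines; /* ~15.4 us per line  */

    /* Double buffering: a finished frame waits up to one full vsync. */
    printf("worst-case buffered lag: %.1f ms\n", frame_ms);

    /* Racing the beam: stay, say, 8 scanlines ahead of the scan-out. */
    printf("racing-the-raster lag:   %.0f us\n", 8.0 * line_us);
    return 0;
}
```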
In terms of graphics, though, the thing about augmented reality is that you want minimum graphics. You’re not creating the entire world. If you put too many graphics on it, you destroy the information. You want the minimum of information on there.
The exception is if it’s something like what Metaio is doing, where you look through a screen and can take an object and locate it in the real world. In that case you want really good graphics. You want to probe the light sources in the environment, take that object, and light it with the light sources that are actually in the environment. It’s quite a hard problem. That’s one of the directions we’re going to take with ray tracing, which makes that happen in real time. If you want things properly lit, casting shadows, with reflections coming from the environment, ray tracing is the only plausible way to do it in real time.
You’ll see that in point-of-sale and things like e-catalogs. Ikea is already doing this. You can get an Ikea catalog and, with your iPhone, if you look through the screen at the catalog, it’ll detect a QR code or something else on the page and do something. In some cases it’ll run a movie. In other cases the object you’re thinking about buying will pop up in 3D. You can take that in all sorts of directions.
VB: Are you guys into wearables, like watches?
McGuinness: We are, actually. There’s one particular company called Ineda, which is using MIPS and PowerVR exclusively in its IoT range. We have SGX in the watch. We’ll be seeing those soon. Those watches are fantastic because they have a battery life of six days. That’s pretty impressive in a smartwatch. It’s the longest battery life I’ve ever seen. MIPS is one reason for that.
ARM’s dirty little secret is that they’re not a low-power company. Everyone says, “Hey, ARM are the low-power kings!” Well, have you ever seen the power consumption of the Cortex-A15? The MIPS architecture is ultimately more efficient than ARM’s.
The other story, the theme of the moment if you like, is security. Everyone’s very concerned about security. On the MIPS side, the architecture has hardware virtualization, so we can create multiple secure zones. We’ve also done that in the GPU, so now we have hardware-virtualized GPUs, and we also have a secure fabric. We have a complete CPU, GPU, and fabric system that’s secure end to end.
It’s more secure than TrustZone. With TrustZone you can have a single trusted zone, which means that all your trusted apps have to go into that one zone. A lot of people don’t want that to happen. Netflix does not want to be in the same trusted zone as anything else. With MIPS, because of the hardware virtualization, you can create trusted zones that are separate from each other, isolate all of your different apps, and keep them secure. That’s a differentiating factor between MIPS and ARM.
A lot of people are saying that’s the secure system that they really want. Then they want to know about the rest of the chip, so we added the secure fabric, and then we went ahead and put the hardware virtualization into the GPU. Without that you can’t really succeed in automotive. But more and more, because people want these devices to be secure, you’re going to see virtualization used here as well.
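(Editor’s note: a toy model of the distinction McGuinness is drawing, our sketch rather than either company’s API. With a single trusted zone, every secure app lands in the same compartment; with per-app hardware virtualization, each one gets its own isolated guest.)

```c
/* Conceptual model only: one shared secure world vs. one isolated
 * guest per trusted app. Not ARM's or Imagination's actual API. */
#include <stdio.h>

int main(void)
{
    const char *apps[] = { "video_drm", "payments", "corporate_vpn" };
    const int n = sizeof(apps) / sizeof(apps[0]);

    /* Single-trusted-zone model: all trusted apps share one world. */
    for (int i = 0; i < n; i++)
        printf("single zone:    %-13s -> secure world 0 (shared)\n", apps[i]);

    /* Hardware-virtualized model: each app gets its own guest, so a
     * compromise of one zone cannot reach the others. */
    for (int i = 0; i < n; i++)
        printf("virtualization: %-13s -> secure guest %d (isolated)\n",
               apps[i], i);
    return 0;
}
```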
VB: I think Nvidia’s solution was two different computers in the car, right? An autopilot computer and an infotainment computer.
McGuinness: Right. That’s what a lot of people are doing now. But who wants to pay for it?