VB: I saw something from Osterhout Design Group recently. They’re one of the military designers, doing night-vision goggles and stuff. They just announced a consumer version coming out this year that’ll be under $1,000. It’s a regular pair of glasses with screens on the lenses that do overlays and other sorts of things while you’re still able to see everything.
McGuinness: We work really closely with the people at Oculus. John Carmack is a big fan of ours. We help them a lot with the graphics for the Rift. The major thing in making it usable — you’ve heard of the effect where, if you have full VR and there’s too much lag between your head movement and the display reacting, it makes you nauseous. Reducing that lag between the action detected by the sensor and the reaction on the screen is critical. John wanted to chase the raster, actually, to not even have a single buffer. That reduces it from milliseconds to microseconds of delay, which is what you need.
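The milliseconds-to-microseconds claim can be sketched with some rough arithmetic. This is a toy model with assumed numbers (60 Hz refresh, 1080-line panel), not figures from the interview: a fully buffered frame costs whole frame-times of delay, while rendering just ahead of the scanline readout costs only the time to scan a few lines.

```python
# Hypothetical latency budget: fully buffered frames vs. "chasing the
# raster" (rendering only a few scanlines ahead of the display readout).
# All numbers here are illustrative assumptions, not measured values.

REFRESH_HZ = 60                      # assumed display refresh rate
FRAME_MS = 1000 / REFRESH_HZ         # ~16.7 ms per frame

def buffered_latency_ms(frames_buffered: int) -> float:
    """A sensor sample waits for whole frame(s) to render and scan out."""
    return frames_buffered * FRAME_MS

def raster_chase_latency_ms(lines_ahead: int, lines_per_frame: int = 1080) -> float:
    """Rendering a few scanlines ahead of the beam: the delay is the time
    to scan those lines, not a whole frame."""
    line_time_ms = FRAME_MS / lines_per_frame
    return lines_ahead * line_time_ms

print(f"double-buffered: {buffered_latency_ms(2):.1f} ms")
print(f"chasing the raster, 16 lines ahead: {raster_chase_latency_ms(16) * 1000:.0f} us")
```

With these assumed numbers, buffering two frames costs tens of milliseconds while staying a handful of scanlines ahead of the beam costs a few hundred microseconds, which is the order-of-magnitude difference McGuinness describes.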
In terms of graphics, though, the thing about augmented reality is that you want minimal graphics. You’re not creating the entire world. If you put too many graphics on it, you destroy the information. You want the minimum of information on there.
The exception is if it’s something like what Metaio are doing, where you look through a screen and can take an object and locate it in the real world. In that case you want really good graphics. You want to probe the light sources in the environment, take that object, and light it with the light sources that are actually in the environment. It’s quite a hard problem. That’s one of the directions we’re going to take with ray tracing. Ray tracing makes that happen in real time. If you want things properly lit, casting shadows, with reflections coming from the environment, ray tracing is the only way to possibly do it in real time.
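The core of the idea can be sketched with simple diffuse shading. Everything below is an illustrative stand-in, not Metaio’s or Imagination’s pipeline: a real system would estimate the probed light from camera imagery and use ray tracing for shadows and reflections, while here the probe result is just hard-coded.

```python
# A minimal sketch: shade a virtual object with a light direction
# "probed" from the real environment, so the inserted object matches the
# scene. The probe result below is a hard-coded assumption standing in
# for a real light-estimation step.

import math

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def lambert(normal, light_dir, light_intensity=1.0):
    """Diffuse (Lambertian) shading: brightness is the cosine of the
    angle between the surface normal and the probed light direction,
    clamped at zero for surfaces facing away."""
    n, l = normalize(normal), normalize(light_dir)
    return max(0.0, sum(a * b for a, b in zip(n, l))) * light_intensity

# Assumed probe result: light coming mostly from above.
probed_light = (0.3, 1.0, 0.2)

# Shade two faces of a virtual object placed into the scene.
up_face = lambert((0, 1, 0), probed_light)     # faces the light
side_face = lambert((-1, 0, 0), probed_light)  # faces away from it
print(up_face, side_face)
```

The hard part the interview alludes to is everything this sketch skips: recovering `probed_light` from the real scene, and computing shadows and reflections against real geometry, which is where ray tracing comes in.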
You’ll see that in point-of-sale and things like e-catalogs. Ikea is already doing this. You can get an Ikea catalog and, with your iPhone, if you look through the screen at the catalog, it’ll detect the QR code or something else in the page there and do something. In some cases it’ll run a movie. In other cases the object you’re thinking about buying will pop up in 3D. You can take that in all sorts of directions.
VB: Are you guys into wearables, like watches?
McGuinness: We are, actually. There’s one particular company called Ineda, which is using MIPS and PowerVR exclusively in its IoT range. We have SGX in the watch. We’ll be seeing those soon. Those watches are fantastic because they have a battery life of six days. That’s pretty impressive in a smartwatch. It’s the longest battery life I’ve ever seen. MIPS is one reason for that.
ARM’s dirty little secret is that they’re not a low-power company. Everyone says, “Hey, ARM are the low-power kings!” Well, have you ever seen the power consumption of the Cortex-A15? The MIPS architecture is ultimately more efficient than ARM.
The other story — the theme of the moment, if you like — is security. Everyone’s very concerned about security. On the MIPS side of things, it has hardware virtualization. We can create multiple secure zones. We’ve also done that in the GPU, so now we have hardware-virtualized GPUs, and we also have a secure fabric. We have a CPU, GPU, and fabric system that is secure end to end.
It’s more secure than TrustZone. With TrustZone you can have a single trusted zone, which means that all your trusted apps have to go into that one zone. A lot of people don’t want that to happen. Netflix does not want to be in the same trusted zone as anything else. With MIPS, because of the hardware virtualization, you can create trusted zones that are separate from each other, isolate all of your different apps, and keep them secure. That’s a differentiating factor between MIPS and ARM.
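The isolation difference being described can be modeled in a few lines. This is a toy illustration, not real TrustZone or MIPS virtualization code: in a single shared trusted zone, every trusted app has neighbors it must trust, while one zone per app leaves nothing else visible.

```python
# Toy model of the two approaches described above. Zone and app names
# are illustrative; this is not an API of either architecture.

class TrustedZone:
    def __init__(self, name):
        self.name = name
        self.apps = []

    def install(self, app):
        self.apps.append(app)

    def visible_to(self, app):
        """In this model, apps in the same zone can see each other."""
        return [a for a in self.apps if a != app]

# TrustZone-style: one secure world that all trusted apps share.
shared = TrustedZone("secure-world")
for app in ("netflix-drm", "payment", "keystore"):
    shared.install(app)
print(shared.visible_to("netflix-drm"))  # neighbors it is forced to trust

# Hardware-virtualization-style: one isolated zone per app.
zones = {app: TrustedZone(app) for app in ("netflix-drm", "payment", "keystore")}
for app, zone in zones.items():
    zone.install(app)
print(zones["netflix-drm"].visible_to("netflix-drm"))  # empty: fully isolated
```

The second arrangement is the one McGuinness attributes to MIPS hardware virtualization: each trusted app gets its own zone, so a compromise or misbehavior in one cannot reach the others.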
A lot of people are saying that’s the secure system that they really want. Then they want to know about the rest of the chip, so we added the secure fabric, and then we went ahead and put the hardware virtualization into the GPU. Without that you can’t really succeed in automotive. But more and more, because people want these devices to be secure, you’re going to see virtualization used here as well.
VB: I think Nvidia’s solution was two different computers in the car, right? An autopilot computer and an infotainment computer.
McGuinness: Right. That’s what a lot of people are doing now. But who wants to pay for it?