We’ve been the brains for our cars for a very long time. But graphics chip maker Nvidia believes it’s time cars started thinking for themselves. That’s why it came up with two new car computers that use its new Tegra X1 mobile superchip.
We weren’t so sure why our cars had to have two different computers that are smarter than the ones we have in our homes or our smartphones. But David Anderson, senior automotive solutions architect at Nvidia’s automotive business unit, helped us understand.
[aditude-amp id="flyingcarpet" targeting='{"env":"staging","page_type":"article","post_id":1639773,"post_type":"story","post_chan":"none","tags":null,"ai":false,"category":"none","all_categories":"business,mobile,","session":"C"}']We caught up with him at the recent 2015 International CES, the big tech trade show in Las Vegas, where Nvidia unveiled the Nvidia Drive CX, a digital cockpit computer for the car, and the Nvidia Drive PX, which can control a car’s autopilot.
The two models, he said, both have to be very powerful and both have to be updatable and connected to the Internet. After all, if a hacker breaks the security of one computer and its network, you don’t want them to be able to compromise the security of the system that controls the vehicle’s safety technology.
Here’s an edited transcript of our interview with Anderson.
VentureBeat: Why do you have two different computers in a car — one for safety, one for entertainment?
Dave Anderson: Right. The reason for the separation of Drive CX and Drive PX is to create two fundamental computers for the car. Drive CX is focused on the user experience of the digital cockpit. That would include everything from infotainment to reconfigurable displays for your digital dashboard. Drive PX is focused specifically on the semi-autonomous driver assistance and the fully autonomous features that will eventually come to vehicles. It was designed in a way that architecturally makes sense for most automotive applications and most automotive OEM implementations today.
VentureBeat: At the Black Hat security conference, a few hackers gleefully demonstrated that they could hack car computers. They pointed to all the different cars they could find where there was too much commonality or networking between the safety and entertainment systems. If you hacked one, you could access the other. If you bring the Internet into the car, maybe you’re bringing hackers in too, and you have to think about that.
Anderson: OEMs are evolving strategies around what their next-generation electrical infrastructure is going to be. These kinds of concerns are going to change the way we approach things. We’ve learned a lot in other product areas that we can apply to ensure a secure experience in the vehicle. That’s another piece we’ll bring as we continue to develop these applications.
VentureBeat: It seems like there’s a lot of asset-sharing you can do. Someone said that backup camera displays are being driven by the government — they want more of those in cars. But that gives people a reason to have a big display in the car that can in turn be used for entertainment. The asset is there for a safety reason, but it can have many purposes.
[aditude-amp id="medium1" targeting='{"env":"staging","page_type":"article","post_id":1639773,"post_type":"story","post_chan":"none","tags":null,"ai":false,"category":"none","all_categories":"business,mobile,","session":"C"}']
Anderson: Government-mandated standards around cameras, or even vehicle-to-vehicle communication, are pushing forward more opportunities for larger displays and advanced applications like we’re talking about in the vehicle. That’s exciting. One of the big things that has to happen for a lot of the semi-autonomous or fully autonomous functionalities is major infrastructure change. The only way infrastructure change is going to happen is if the government takes an active role in pushing that ahead.
VentureBeat: What are some of the concept cars you’re showing off?
Anderson: One is actually a production car. It’s a brand-new Audi TT featuring a fully digital cockpit powered by Nvidia. That’s a fully reconfigurable display. It not only shows vehicle information like your speedometer and tachometer, but also puts all the integrated media control functionality and navigation in a central spot that’s more accessible for the driver. It’s one of the first examples of this idea of a true digital cockpit. It’s also one of the first examples where they’ve completely removed the display from the center stack. It gets past this idea that you have to have two discrete systems.
VentureBeat: The look of the display is up to the user? Can they decide what kind of display they want to look at?
[aditude-amp id="medium2" targeting='{"env":"staging","page_type":"article","post_id":1639773,"post_type":"story","post_chan":"none","tags":null,"ai":false,"category":"none","all_categories":"business,mobile,","session":"C"}']
Anderson: The user doesn’t have control over all of the settings, but they have the ability to change what information is presented. They can choose how they’d like their gauge positions to be. They can choose to look at navigation or not. They can even choose to read email coming in — that is, the computer itself uses text-to-speech to read you an inbound email. It’s really exciting, this connected car experience.
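Anderson doesn’t say what software the Audi system uses for this, but the general idea is easy to sketch. Below is a minimal, illustrative text-to-speech example in Python using the open-source pyttsx3 library; the read_email_aloud helper and its inputs are hypothetical and not part of any in-car product.

```python
# Illustrative only: reading an inbound message aloud with the pyttsx3 library.
# This is not the Audi system's software; it just shows the general idea.
import pyttsx3

def read_email_aloud(sender: str, body: str) -> None:
    """Speak a short summary of an inbound email (hypothetical helper)."""
    engine = pyttsx3.init()   # pick the platform's default speech driver
    engine.say(f"New email from {sender}.")
    engine.say(body)
    engine.runAndWait()       # block until speech has finished

read_email_aloud("Alex", "Running ten minutes late for the meeting.")
```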
VentureBeat: And what is the other car?
Anderson: The other car is made by a company called Renovo. That’s a small company in Campbell, California. They’re creating a limited production run of a very high-end, fully electric sports car. Inside the Renovo supercar we’ve created an all-new look and feel for both the digital cluster and the infotainment platform in the vehicle, all powered by our Drive CX platform.
VentureBeat: Will the car market be a large one soon for everybody in the chip space?
[aditude-amp id="medium3" targeting='{"env":"staging","page_type":"article","post_id":1639773,"post_type":"story","post_chan":"none","tags":null,"ai":false,"category":"none","all_categories":"business,mobile,","session":"C"}']
Anderson: Absolutely. We’ve already shipped 7.5 million units into the field; 7.5 million vehicles on the road today have Nvidia technology. That’s projected to grow. Before too long, we could see upwards of 25 million.
VentureBeat: You spent a lot of time at the press conference showing the neural network technology — self-parking and things like that. What was your goal there? Is it just that there’s a very large computing task awaiting us in the future?
Anderson: The core thing to underline with the deep learning discussion and the neural network discussion is the fact that our cars are becoming more advanced. They’re probably the most advanced mobile computers we own. The notion behind using deep learning in our Drive PX platform is that we’re trying to provide a greater level of situational awareness for the driver and for the car itself. The car, using deep learning and neural networks, can become a learning entity.
Imagine that your car, with a forward- or rear-facing camera, could start to recognize objects. Not only recognizing that it’s seeing an object, like a car in front of it, but perhaps getting so advanced that it can tell what type of car that is and start to classify those objects. The neural network gives the PX system the ability to decide what it’s seeing, and we’ve trained the system with that particular capability.
[aditude-amp id="medium4" targeting='{"env":"staging","page_type":"article","post_id":1639773,"post_type":"story","post_chan":"none","tags":null,"ai":false,"category":"none","all_categories":"business,mobile,","session":"C"}']
This is an example over here on the wall. This is Drive PX running that deep learning algorithm for object recognition. Right now it’s tracking and identifying 14 different types of vehicles. You can see that not only is it finding something like a police car, but even as it’s passing other cars, it can identify what those cars are. The advantage of this technology over existing fixed-silicon solutions is that it can fundamentally change over time. Because it’s software, written to run universally on our CUDA-capable GPUs, it can keep being extended in the future. That flexibility is what I think is the major advantage of the Nvidia system in the long term.
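To make the idea concrete, here is a minimal sketch of GPU-accelerated image classification of the kind Anderson describes. It uses a generic pretrained network from the open-source PyTorch/torchvision stack purely as an illustration; it is not Nvidia’s Drive PX software, and the classify_frame helper and its camera-frame input are hypothetical.

```python
# Illustrative sketch only: generic image classification on a CUDA-capable GPU.
# This is not Nvidia's Drive PX stack; it just shows the shape of the idea.
import torch
from torchvision import models, transforms
from PIL import Image

# Use the GPU when one is available, otherwise fall back to the CPU.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# A pretrained ImageNet classifier stands in for a vehicle-recognition network.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT).to(device).eval()

# Standard preprocessing for a single camera frame.
preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

def classify_frame(path: str) -> int:
    """Return the predicted class index for one camera frame (hypothetical input)."""
    frame = Image.open(path).convert("RGB")
    batch = preprocess(frame).unsqueeze(0).to(device)
    with torch.no_grad():
        logits = model(batch)
    return int(logits.argmax(dim=1).item())
```

The flexibility Anderson points to maps onto this kind of setup: because the classifier is software running on a general-purpose GPU, swapping in a retrained or improved network is a model update rather than a silicon change.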
VentureBeat: What about the distinction between providing a chip and providing this whole system, bringing on all these software experts to do autopilots and all that?
Anderson: Nvidia’s approach to automotive is to look at the whole-stack solution. The reason for that is that, just like with a lot of other rapidly advancing technology, the expertise required to build much of it resides only with the people who are actually developing the technology.
What we’ve found works successfully for a lot of our OEM engagements is for us to become a collaborative partner with the OEM, and with the Tier 1 supplier that will fabricate the final product, so that they can leverage all of our software and hardware capability as a company. It’s not necessarily about replacing someone else’s role in the overall industry; it’s something we can do to augment capabilities.
[aditude-amp id="medium5" targeting='{"env":"staging","page_type":"article","post_id":1639773,"post_type":"story","post_chan":"none","tags":null,"ai":false,"category":"none","all_categories":"business,mobile,","session":"C"}']
VentureBeat: Do you see a day when a chip that might go in a car would be different from a chip you’d make for gaming?
Anderson: We actually do that. The product we’re shipping into automotive is a product that we specifically designed for the automotive market. The distinction is clear in terms of what we announced with X1. We made a concerted effort to say that the X1 was being targeted for automotive applications, specifically because things like what we’re showing here absolutely require it. The CX platform running a combined user experience for the vehicle definitely needs X1.
VentureBeat: Are gamers going to want the X1? It has more cores.
Anderson: Absolutely. Everybody’s going to want X1. No question about that. But the reality is that we wanted to make a clear statement about how important automotive is for our company. This is a step to show that our vision is to enable the automotive market with as advanced a technology as it can have.
VentureBeat: Does that go so far as having different chips with different components on them for gaming versus cars? Or would you stick with one universal hardware that would be customizable through software?
Anderson: Like anything, the core of our technology will always be based around our GPUs. Just as we started with Tegra K1 and Kepler, unifying the GPU architecture across everything we make is a core focus for what we’re going to do in the future. With X1 now containing Maxwell, and Maxwell being part of everything we do from a GeForce perspective, it continues that same trend.
There are specific needs for the automotive market, though. We’re still working on creating specific solutions that would best meet those market demands. More than likely, we’ll have specific products for automotive. That’s the plan.