
Above: Nvidia does both software and hardware for VR.

Image Credit: Nvidia

GamesBeat: With the 360 videos, is there anything you participate in there?

Paul: With a platform like Video Stitch, our GPUs take in all the video feeds from the individual cameras and piece them together. They do all the work to output that single 360 video file. Most of the video stitching that happens is GPU accelerated.

GamesBeat: So what does Nvidia’s SDK do?

Paul: Yeah. We work with a lot of the video stitching companies. They’re programming it in CUDA to use the GPU as a stitching platform. Then we have tools in our SDK, like GPUDirect for Video. It’s a low-latency way to get video in and out of the GPU. We have some SDK support around that to help people do better 360 video.
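
To make the stitching step concrete, here is a toy sketch of the core mapping a stitcher performs for every output pixel. It is a generic illustration, not Video Stitch’s pipeline or anything from Nvidia’s SDK: the four-camera rig, the nearest-camera selection, and all the numbers are assumptions, and a real CUDA stitcher would also correct lens distortion, blend overlapping cameras, and run this per pixel on the GPU.

```python
import math

# Toy sketch of the core mapping in 360 stitching: each pixel of the
# equirectangular output corresponds to a viewing direction, and the
# stitcher decides which camera sees that direction. A real stitcher
# would also project into the chosen camera's image, correct its lens
# distortion, and blend overlapping cameras; none of that is shown here.

# Hypothetical rig: four cameras looking out horizontally, 90 degrees apart.
CAMERA_YAWS_DEG = [0.0, 90.0, 180.0, 270.0]

def direction_from_panorama(u, v):
    """Map normalized panorama coordinates (u, v in [0, 1]) to a unit ray."""
    lon = (u - 0.5) * 2.0 * math.pi            # longitude, -pi .. pi
    lat = (0.5 - v) * math.pi                  # latitude, -pi/2 .. pi/2
    return (math.cos(lat) * math.sin(lon),
            math.sin(lat),
            math.cos(lat) * math.cos(lon))

def pick_camera(u, v):
    """Return the index of the camera whose optical axis best matches the ray."""
    x, y, z = direction_from_panorama(u, v)
    best_index, best_dot = 0, -2.0
    for i, yaw in enumerate(CAMERA_YAWS_DEG):
        yaw_rad = math.radians(yaw)
        axis = (math.sin(yaw_rad), 0.0, math.cos(yaw_rad))
        dot = x * axis[0] + y * axis[1] + z * axis[2]
        if dot > best_dot:
            best_index, best_dot = i, dot
    return best_index

# The center of the panorama faces camera 0; a quarter of the way across
# (longitude -90 degrees) faces the camera at yaw 270.
print(pick_camera(0.5, 0.5), pick_camera(0.25, 0.5))  # 0 3
```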

We have SDK features for app developers that help them get better performance and scale across multiple GPUs. For headsets, context priority helps them switch faster between tasks to cut latency. Direct mode gives them a direct interface into the GPU to provide better plug-and-play compatibility. We have a bunch of professional features that help people build VR installations — CAVEs, immersive environments.

One of our main features is called multi-res shading. It uses our architecture to accelerate VR rendering. It helps to understand a little bit of how VR rendering works. If you take a phone, this is basically a VR display. With Samsung Gear VR it’s literally a VR display, but the displays in the PC headsets are basically cell phone displays as well. If you were to take it and hold it to your face, though, you can’t really see anything. Your eye can’t focus on the display when it’s close to your face.

The headset guys use lenses. You view the display through a lens and it helps your eye focus. It also gives you depth and field of view. The issue with a lens, though, is that it distorts the image. When you look through this grid, you see this pincushion effect, warping what’s underneath it. The GPU compensates for that by pre-distorting the image, so when you view it through the lens it looks normal. The distortion meets reverse distortion, and your eye can view an image that’s very close to your face, but with good depth. Your eye can focus on the image. That’s how VR optics works.
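
As a rough illustration of that pre-distortion step, here is a minimal sketch of a radial barrel warp that counteracts a pincushion lens. The polynomial model and the K1/K2 coefficients are made-up illustrative values, not taken from any actual headset or from Nvidia’s tools.

```python
# Minimal sketch of lens pre-distortion: pull pixels toward the center
# (a barrel warp) so the lens's pincushion stretch cancels it out.
# The distortion model and coefficients are illustrative assumptions.
K1, K2 = 0.22, 0.24

def barrel_predistort(x, y):
    """Pre-distort a point given in normalized lens coordinates,
    with (0, 0) at the lens center and the edges near +/-1."""
    r2 = x * x + y * y
    shrink = 1.0 / (1.0 + K1 * r2 + K2 * r2 * r2)  # shrink more toward the edge
    return x * shrink, y * shrink

print(barrel_predistort(0.0, 0.0))  # center is untouched: (0.0, 0.0)
print(barrel_predistort(0.9, 0.0))  # edge point pulled inward to roughly (0.67, 0.0)
```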

The challenge with that is that GPUs today don’t render these distorted images natively. They render a typical rectilinear PC display image. Then they have to do a post-processing pass that distorts the image. Here we’re rendering a 3K image in a rectangle shape and then distorting it as a post-processing pass. That black area is pixels that get thrown away in that pass.

If you look at the way the optics work, the center of the display basically has a one-to-one mapping. This pixel here is in the same location there. But the outside periphery is very squished down. All the pixels on the outside here get scaled down pretty heavily. If we could render the image in a way that wasn’t a one-to-one scale, we could save some of the rendering workload.
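
That one-to-one center and squished periphery falls straight out of the warp sketched above. A quick numerical check of the local scale, using the same made-up coefficients, shows roughly 1.0 at the center, about 0.8 halfway out, and about 0.2 near the edge, which is why pixels rendered at full density out there are largely wasted work.

```python
# Same illustrative distortion coefficients as the sketch above.
K1, K2 = 0.22, 0.24

def warp_radius(r):
    """Output radius of the pre-distortion warp as a function of input radius."""
    return r / (1.0 + K1 * r * r + K2 * r ** 4)

def local_scale(r, eps=1e-4):
    """Numerically estimate how much a small radial step shrinks after the
    warp at radius r (1.0 means a one-to-one pixel mapping)."""
    return (warp_radius(r + eps) - warp_radius(r)) / eps

for r in (0.0, 0.5, 0.9):
    print(f"radius {r:.1f}: local scale ~ {local_scale(r):.2f}")
# Prints roughly 1.00 at the center, 0.79 halfway out, 0.20 near the edge.
```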

In our architecture we’re able to divide the image into nine regions and scale each region based on the needed pixel density in that region. On the periphery we don’t need as many pixels. It’s going to get distorted down anyway. We can draw this image as nine different images very quickly, as if it were one image. That allows us to draw something a bit closer to the final distorted image and save drawing all these pixels.
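
As a back-of-the-envelope sketch of the savings, the snippet below splits an illustrative per-eye render target into a 3x3 grid, keeps the center region at full density, and shades the peripheral regions at reduced density. The resolution, split point, and scale factor are invented for illustration, not Nvidia’s actual settings, and the real feature works through the hardware viewport path rather than as arithmetic like this.

```python
# Back-of-the-envelope sketch of the multi-res idea: split the render
# target into a 3x3 grid and shade the outer regions at reduced density.
# The resolution, split point, and scale factor are illustrative guesses.
WIDTH, HEIGHT = 1600, 1600     # per-eye render target (illustrative)
CENTER_FRACTION = 0.6          # middle region covers 60% of each axis
PERIPHERY_SCALE = 0.5          # peripheral directions shaded at half density

def shaded_pixels():
    spans = [(1.0 - CENTER_FRACTION) / 2, CENTER_FRACTION, (1.0 - CENTER_FRACTION) / 2]
    total = 0.0
    for row, h_frac in enumerate(spans):
        for col, w_frac in enumerate(spans):
            # Only the directions that extend into the periphery are scaled down:
            # edge regions along one axis, corner regions along both.
            w_scale = 1.0 if col == 1 else PERIPHERY_SCALE
            h_scale = 1.0 if row == 1 else PERIPHERY_SCALE
            total += (WIDTH * w_frac * w_scale) * (HEIGHT * h_frac * h_scale)
    return total

full_pass = WIDTH * HEIGHT
print(f"shaded {shaded_pixels() / full_pass:.0%} of a full-resolution pass")  # ~64%
```

With these invented numbers, the nine-viewport pass shades roughly 64 percent of the pixels a single full-resolution pass would; the real savings depend on the lens profile and how aggressively the outer regions are scaled.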


Above: Multi-res shading is one way to reduce the amount of processing required for VR.

Image Credit: Nvidia

On most architectures this would be a decelerator. It would slow things down because you have to draw nine different viewpoints. But on our architecture we get about a 50 percent speed-up while maintaining image quality. We’re working with Unreal and some other game and app developers to integrate this technology and deliver much better performance for virtual reality.

GamesBeat: How much will this all cost?

Paul: When you look at what’s needed for VR ready PCs, that starts around the GTX 970, our $300-ish graphics card. We’ve been working with OEM partners to build PCs based on this starting at around $1000. If you’re a PC gamer now, upgrade for $300 to become VR ready. If you’re new to the space, you can start at around $1000 for a PC. On the notebook side, back in September we announced our first VR ready notebooks. These are the only notebooks on the market that are capable of driving an Oculus or HTC experience. A number of them just started shipping last month from guys like MSI and Clevo. These aren’t thin and light notebooks, but you can put them in a backpack and take your VR experience with you.

VR is very demanding. It requires a fairly beefy card. But there’s still going to be a pretty good installed base in the next year and the years to come. This is our forecast looking out over time at how many VR ready PCs will be out there, just taking Moore’s Law out over time along with our installed base. Just based on our normal PC gaming business, we’ll be able to grow a very healthy ecosystem and installed base of PCs that people can attach VR experiences to.

Technologies like multi-res shading effectively expand that installed base. They bring more performance to the same class of hardware. Developers can get more installed base for their games to run on, or crank up their visual effects and deliver a higher quality experience. Whether we’re talking about working with headset guys like Oculus and HTC, or app developers and engine developers like Unreal and CCP, or even the 360 capture solutions like Video Stitch and the tracking and input guys, we’re working with the entire ecosystem to help them build better VR experiences and make them a reality.

GamesBeat: It seems like a smaller installed base than I might have thought would be available next year. Does your baseline match what Oculus has predicted?

Paul: Our 970-class products started shipping toward the end of last year. That’s the recommended level of performance for Oculus as well. It’s aligned with their expectations. In terms of what we’ll likely see in the market next year, it should be a healthy PC installed base for the headsets that we’re building. Obviously it will grow as we deliver capable graphics into more and more PCs.

GamesBeat: Has anybody made any good forecasts yet?

Paul: I probably have a half-dozen research reports that have forecasts for next year to the end of the decade. Some guys have tried to put numbers behind it. I don’t think we’ll speculate as much around the number of headsets. But we have more visibility into how many PCs will be capable.


Above: Nvidia’s GameWorks VR enables more PCs to run VR.

Image Credit: Nvidia

GamesBeat: Sony’s been touting their 30 million number as far as their installed base, 30 million PS4s. But they haven’t announced a price for their headset yet.

Paul: The PC market sets a higher bar in terms of performance and quality level. The resolution is higher. The frame rate is higher. And the installed base will catch up pretty quickly.

GamesBeat: Are you guys assuming a certain number of brand new sales to get to those 13 million and 25 million numbers (the number of PCs capable of running immersive VR)?

Paul: This is normal PC business. I expect VR will help accelerate growth in the high end of the market.

GamesBeat: Is that Nvidia only or is that the whole market?

Paul: This is Nvidia. But in this space Nvidia has 85 to 90 percent market share. It’s close to the whole market. What do you think about VR adoption?

GamesBeat: I would have thought it was the other way around as far as the available market goes, that it would be much larger on the PC. Your numbers are smaller than Sony’s available market.

Paul: Yeah. But again, it’s a higher bar. Certainly in terms of the number of PCs that have PS4-class performance, it would be much more. The bar on the PC side has been set at a level that’s going to ensure a high-quality experience.

GamesBeat: Do you see many of these experiences porting directly from Samsung all the way up to PC?

Paul: Some stuff will come over from Gear VR to Oculus and the like.

GamesBeat: But I wonder how much work developers will have to do to go from platform to platform.

Paul: It’ll depend on what engine they’re building on. Some engines are really good about pushing to multiple platforms. Using Unity or Unreal, you can build one game and push it to multiple platforms without a ton of work. It just depends on if they’ve built it on an engine that has that capability, or if they’ve just done their own thing.


Above: An Oculus demo at the VRX event in San Francisco.

Image Credit: Dean Takahashi