The folks at Nvidia, the world’s largest maker of stand-alone graphics chips, are good at numbers. So I talked to them recently about their expectations for virtual reality gaming. They showed me some demos of the Oculus Rift and HTC Vive virtual reality headsets, powered by a PC using Nvidia’s GeForce GTX Titan X graphics chip. The demos were just a taste of the many VR experiences we’ll see at the 2016 International CES, the big tech trade show in Las Vegas next week.
And they said that running VR takes about seven times more graphics processing power than a standard PC game. By next year, when the first major PC-based VR headsets ship, there will be about 13 million PCs on the market powerful enough to run VR the right way. Nvidia says it can extend that number to 25 million if VR game makers use Nvidia’s GameWorks VR software, which makes VR processing more efficient.
That shows that the market for VR won’t be gigantic in 2016, at least on the PC. Sony is also launching PlayStation VR on the PS4 video game console. Sony’s installed base of 30 million PS4 units means that its market potential will be bigger than the PC’s. But Nvidia notes that VR on the PC will be more demanding in terms of the graphics processing required.
We recently interviewed Jason Paul, general manager of Nvidia’s Shield, gaming, and VR business. Paul said Nvidia is in contact with more than 600 companies that are working on VR projects. That’s a huge VR ecosystem, and Nvidia wants to be one of the leaders that makes it all happen.
Here’s an edited transcript of our conversation.
GamesBeat: We saw that there were 234 companies that one of our services was tracking.
Jason Paul: We have more than 600 companies in our tracker now. There’s probably one or two thousand at this point, globally.
GamesBeat: Is that projects or actual startups?
Paul: Startups and companies. It includes Fortune 500 companies that are doing VR as well. But there’s a massive amount of interest.
GamesBeat: I’m starting to get a handle on the larger picture. Seattle, the Bay Area, and Los Angeles seem to have the concentrations. New York as well. But it’s happening everywhere.
Paul: A lot of European-based companies. We’re seeing a bit of activity in China as well, a number of headset guys building headsets.
GamesBeat: I’m waiting to sort it all out before I take meetings with everyone. But we’ll be seeing a lot of them at CES. Virtuix was apparently ready a year ago.
Paul: I have one on order. Still waiting. But apparently they’re shipping their first units to the Kickstarter.
GamesBeat: They were building them on the assumption that the headsets would be out by now.
Paul: A few people are probably waiting on Oculus and HTC, yeah. One or two. But it’s getting close. Our perspective on VR is that it’s a new computing paradigm, like PC, mobile, or cloud. There will be a lot of energy around games, but it’s going to be much bigger than games. The apps and experiences that come out are going to change the way we do business, the way we interact socially with people, how we enjoy and consume entertainment.
Things like Tilt Brush, Everest VR as an educational and experiential tool, some of the news and storytelling projects, like gaining the perspective of Syrian refugees. We’re looking at VR with gaming at the core of it, but also in professional spaces, along with other social and consumer apps.
GamesBeat: I feel like if someone gets one of these strong art tools out there, it could be great.
Paul: Have you seen the Oculus art app, Medium? They have one with their Touch controllers. It’ll be really interesting to see people go crazy with sculpture and new forms of art.
We see the potential of VR as very large, but we also see a big challenge as far as the computing power that’s required. If you look at your typical PC gaming experience, 90 percent of the gamers out there play at 1080p. For a smooth experience you don’t want to go below 30fps. Compare that to VR where the displays are about 2K, but you have to render closer to 3K, and you don’t want to go below 90fps. It’s about a sevenfold increase in raw performance to render for VR versus traditional PC gaming. You have to do that in less than 20 milliseconds from head rotation to what shows up on your display.
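To put rough numbers behind that sevenfold figure, here is a minimal back-of-the-envelope sketch. The exact VR render-target resolution is an assumption based on the “closer to 3K” comment above; real targets vary by headset and by how much oversampling the runtime requests.

```python
# Back-of-the-envelope check of the "sevenfold" claim. The VR render
# target of 3024x1680 is an assumed "closer to 3K" figure, not an
# official spec; the PC baseline is 1080p at 30 fps.

pc_pixels_per_sec = 1920 * 1080 * 30   # ~62 million pixels/s
vr_pixels_per_sec = 3024 * 1680 * 90   # ~457 million pixels/s

print(vr_pixels_per_sec / pc_pixels_per_sec)  # ~7.3x raw pixel throughput
```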
GamesBeat: What is Nvidia doing in VR?
Paul: Nvidia is doing three core things to solve this problem. First, we’re building fast GPUs, architected specifically for VR. Our Maxwell architecture has specific capabilities and features that make it very fast for VR. We have some technology that increases performance by up to 50 percent for VR applications.
Software-wise we’re making sure the out-of-box experience for customers is perfect. We want the first VR experience everyone gets when the headsets come out to be a good one – no stutter, no lag. Our GeForce Experience software and our Game Ready drivers are core to making sure that experience is delivered.
We’re working with all of the VR ecosystem through an SDK we call GameWorks VR on the consumer side and DesignWorks VR on the professional side. That helps headset manufacturers get lower latency and plug-and-play compatibility, and it helps developers get better performance out of their apps.
We’re engaged in this across all of our businesses. GeForce for consumer and gaming PCs, Quadro for professional visualization — theme parks, entertainment centers, location-based installations of VR — and Tegra on the mobile side, where there are a couple of headsets. A company called Game Face Labs down in L.A. has integrated a Tegra processor into its VR headset. Another company in Silicon Valley called Atheer Labs is building an AR headset with Tegra in a hip pack.
Our GPUs are being used on the graphics side to power the headsets, but they’re also being used on the capture side to stitch 360 video. Companies like Video Stitch are building GPU-accelerated stitching platforms that take images from three to 24 cameras and piece them together into 360 panoramas using GPUs. And there’s a bunch of activity going on in tracking and machine learning around GPUs.
GamesBeat: With the 360 videos, is there anything you participate in there?
Paul: With a platform like Video Stitch, our GPUs take in all the video feeds from the individual cameras and piece them together. They do all the work to output that single 360 video file. Most of the video stitching that happens is GPU accelerated.
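For a sense of what “piecing the feeds together” means, here is a deliberately naive stitching sketch: it assigns each column of an equirectangular panorama to the camera whose heading is closest and copies pixels from it. Production stitchers reproject per pixel, correct for lens distortion, and blend seams on the GPU; the camera field of view, headings, and output size below are illustrative assumptions.

```python
import numpy as np

# Naive 360 stitch: for each output column, pick the camera whose yaw is
# closest to that column's longitude and copy its pixels. No lens
# correction or blending; FOV and output size are assumptions.

def naive_stitch(frames, yaws_deg, fov_deg=120.0, out_w=2048, out_h=1024):
    """frames: list of HxWx3 uint8 arrays; yaws_deg: each camera's heading."""
    pano = np.zeros((out_h, out_w, 3), dtype=np.uint8)
    yaws = np.asarray(yaws_deg, dtype=float)
    for x in range(out_w):
        lon = x / out_w * 360.0 - 180.0                 # output longitude
        offsets = (lon - yaws + 180.0) % 360.0 - 180.0  # wrapped yaw offsets
        cam = int(np.argmin(np.abs(offsets)))
        src = frames[cam]
        # Map the offset within the camera's horizontal FOV to a source column.
        u = np.clip(offsets[cam] / fov_deg + 0.5, 0.0, 1.0)
        col = int(u * (src.shape[1] - 1))
        rows = np.linspace(0, src.shape[0] - 1, out_h).astype(int)
        pano[:, x] = src[rows, col]
    return pano

# Example: four cameras facing 0, 90, 180, and 270 degrees.
frames = [np.zeros((720, 1280, 3), dtype=np.uint8) for _ in range(4)]
print(naive_stitch(frames, [0, 90, 180, 270]).shape)  # (1024, 2048, 3)
```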
GamesBeat: So what does Nvidia’s SDK do?
Paul: Yeah. We work with a lot of the video stitching companies. They’re programming it in CUDA to use the GPU as a stitching platform. Then we have tools in our SDK, like GPUDirect for Video. It’s a low-latency way to get video in and out of the GPU. We have some SDK support around that to help people do better 360 video.
We have SDK features for app developers that help them get better performance and scale across multiple GPUs. For headsets, context priority helps them switch faster between tasks to cut latency. Direct mode gives them a direct interface into the GPU to provide better plug-and-play compatibility. We have a bunch of professional features that help people build VR installations — caves, immersive environments.
One of our main features is called multi-res shading. It uses our architecture to accelerate VR rendering. It helps to understand a little bit of how VR rendering works. If you take a phone, this is basically a VR display. With Samsung Gear VR it’s literally a VR display, but the displays in the PC headsets are basically cell phone displays as well. If you were to take it and hold it to your face, though, you can’t really see anything. Your eye can’t focus on the display when it’s close to your face.
The headset guys use lenses. You view the display through a lens and it helps your eye focus. It also gives you depth and field of view. The issue with a lens, though, is that it distorts the image. When you look through this grid, you see this pincushion effect, warping what’s underneath it. The GPU compensates for that by pre-distorting the image, so when you view it under the lens it looks normal. The distortion meets reverse distortion, and your eye can view an image that’s very close to your face, but with good depth. Your eye can focus on the image. That’s how VR optics works.
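As a minimal illustration of that pre-distortion step, here is a small sketch that applies a barrel warp to normalized image coordinates so a pincushion lens would stretch the result back to normal. The radial-polynomial model and coefficients are illustrative assumptions; actual headsets ship per-lens calibration data.

```python
# Minimal sketch of lens pre-distortion, assuming a simple radial
# polynomial model. The coefficients K1 and K2 are purely illustrative.

K1, K2 = 0.22, 0.24  # illustrative barrel-distortion coefficients

def predistort(u, v):
    """Barrel-distort normalized coordinates (centered at 0,0) so the
    lens's pincushion distortion cancels it and the image looks normal."""
    r2 = u * u + v * v
    scale = 1.0 + K1 * r2 + K2 * r2 * r2
    return u / scale, v / scale

print(predistort(0.9, 0.0))  # edge point gets pulled toward the center
print(predistort(0.0, 0.0))  # center point is unchanged
```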
The challenge with that is that GPUs today don’t render these distorted images natively. They render your typical rectilinear PC image. Then they have to do a post-processing pass that distorts it. Here we’re rendering a 3K image in a rectangle shape and then distorting it as a post-processing pass. That black area is pixels that get thrown away in that pass.
If you look at the way the optics work, the center of the display basically has a one-to-one mapping. This pixel here is in the same location there. But the outside periphery is very squished down. All the pixels on the outside here get scaled pretty heavily. If we can render the image in a way that isn’t one-to-one, we can save some of the rendering workload.
In our architecture we’re able to divide the image into nine regions and scale each region based on the needed pixel density in that region. On the periphery we don’t need as many pixels. It’s going to get distorted down anyway. We can draw this image as nine different images very quickly, as if it were one image. That allows us to draw something a bit closer to the final distorted image and save drawing all these pixels.
On most architectures this would be a decelerator. It would slow things down because you have to draw nine different viewports. But on our architecture we get about a 50 percent speed-up while maintaining image quality. We’re working with Unreal and some other game and app developers to integrate this technology and deliver much better performance for virtual reality.
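To make the savings concrete, here is a small sketch estimating how many pixels get shaded when the border regions of a 3x3 split render at reduced density. The 60 percent center split and the 0.5 peripheral scale are illustrative values, not Nvidia’s; the real speed-up depends on the chosen scales and on how shading-bound the scene is.

```python
# Estimate of pixel savings from a 3x3 multi-res split. The center region
# keeps full resolution; the eight border regions render at reduced
# density. The 0.6 center fraction and 0.5 peripheral scale are
# illustrative assumptions.

CENTER_FRACTION = 0.6   # fraction of width/height covered by the center region
PERIPHERAL_SCALE = 0.5  # linear pixel-density scale for the border regions

center_area = CENTER_FRACTION ** 2   # 36% of the image area
border_area = 1.0 - center_area      # 64% of the image area

# Border regions shade PERIPHERAL_SCALE^2 as many pixels per unit area.
shaded_fraction = center_area + border_area * PERIPHERAL_SCALE ** 2
print(f"Pixels shaded vs. a full-resolution render: {shaded_fraction:.0%}")  # ~52%
```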
GamesBeat: How much will this all cost?
Paul: When you look at what’s needed for VR-ready PCs, that starts around the GTX 970, our $300-ish graphics card. We’ve been working with OEM partners to build PCs based on this starting at around $1,000. If you’re a PC gamer now, you can upgrade for $300 to become VR-ready. If you’re new to the space, you can start at around $1,000 for a PC. On the notebook side, back in September we announced our first VR-ready notebooks. These are the only notebooks on the market that are capable of driving an Oculus or HTC experience. A number of them just started shipping last month from guys like MSI and Clevo. These aren’t thin-and-light notebooks, but you can put them in a backpack and take your VR experience with you.
VR is very demanding. It requires a fairly beefy card. But there’s still going to be a pretty good installed base in the next year and the years to come. This is our forecast of how many VR-ready PCs will be out there over time, just extrapolating Moore’s Law along with our installed base. Just based on our normal PC gaming business, we’ll be able to grow a very healthy ecosystem and installed base of PCs that people can attach VR experiences to.
Technologies like multi-res shading effectively expand that installed base. They bring more performance to the same class of hardware. Developers can get a bigger installed base for their games to run on, or crank up their visual effects and deliver a higher-quality experience. Whether we’re talking about working with headset guys like Oculus and HTC, or app and engine developers like Unreal and CCP, or even the 360 capture solutions like Video Stitch and the tracking and input guys, we’re working with the entire ecosystem to help them build better VR experiences and make them a reality.
GamesBeat: It seems like a smaller installed base than I might have thought would be available next year. Does your baseline match what Oculus has predicted?
Paul: Our 970-class products started shipping toward the end of last year. That’s the recommended level of performance for Oculus as well. It’s aligned with their expectations. In terms of what we’ll likely see in the market next year, it should be a healthy PC installed base for the headsets that we’re building. Obviously it will grow as we deliver capable graphics into more and more PCs.
GamesBeat: Has anybody made any good forecasts yet?
Paul: I probably have a half-dozen research reports that have forecasts for next year to the end of the decade. Some guys have tried to put numbers behind it. I don’t think we’ll speculate as much around the number of headsets. But we have more visibility into how many PCs will be capable.
GamesBeat: Sony’s been touting their installed base number, 30 million PS4s. But they haven’t announced a price for their headset yet.
Paul: The PC market sets a higher bar in terms of performance and quality level. The resolution is higher. The frame rate is higher. And the installed base will catch up pretty quickly.
GamesBeat: Are you guys assuming a certain number of brand new sales to get to those 13 million and 25 million numbers (the number of PCs capable of running immersive VR)?
Paul: This is normal PC business. I expect VR will help accelerate growth in the high end of the market.
GamesBeat: Is that Nvidia only or is that the whole market?
Paul: This is Nvidia. But in this space Nvidia has 85 to 90 percent market share. It’s close to the whole market. What do you think about VR adoption?
GamesBeat: I would have thought it was the other way around as far as the available market goes, that it would be much larger on the PC. Your numbers are smaller than Sony’s available market.
Paul: Yeah. But again, it’s a higher bar. Certainly in terms of the number of PCs that have PS4-class performance, it would be much more. The bar on the PC side has been set at a level that’s going to ensure a high-quality experience.
GamesBeat: Do you see many of these experiences porting directly from Samsung all the way up to PC?
Paul: Some stuff will come over from Gear VR to Oculus and the like.
GamesBeat: But I wonder how much work developers will have to do to go from platform to platform.
Paul: It’ll depend on what engine they’re building on. Some engines are really good about pushing to multiple platforms. With Unity or Unreal, you can build one game and push it to multiple platforms without a ton of work. It just depends on whether they’ve built it on an engine that has that capability, or if they’ve done their own thing.