BELLEVUE, Wash. — I traveled here to get a download on the latest in virtual reality. Valve's SteamVR is a hardware and software platform for virtual reality applications: you look into stereoscopic goggles and feel like you're immersed inside a virtual world. After seeing a lot of virtual reality demos over the past couple of years, I was impressed. Valve's system, which is expected to hit the market in the fall as HTC ships its Vive headset and other hardware, didn't make me sick.
Not throwing up is always a plus. The imagery of the simulated world keeps up with the movements you make. The precision of the system — designed by Valve and implemented by HTC — is deliberately tuned to minimize motion sickness, said Jeep Barnett, a game programmer at Valve who ran my demo. And the room-scale, 360-degree experience looks so good that it remains the technology to beat in virtual reality.
Valve remains one of the most interesting companies in the game industry, as it controls its own destiny. Founded in 1996 by former Microsoft employees Gabe Newell and Mike Harrington, the company started out as a game developer, and its first game, Half-Life, was a huge hit when it debuted in 1998. It produced hit after hit, and along the way it created the Steam digital distribution service on the PC, which has turned Valve into a digital game distribution giant. The company now sees itself as providing much-needed openness and independence in gaming, building technologies that many companies can use. Rivals such as Microsoft and Facebook's Oculus are more inclined to back proprietary technologies, and that's why Valve is moving forward with open projects such as SteamVR.
SteamVR is very different from the better-known Oculus VR headset. It tracks you with timed laser sweeps: two Lighthouse base stations (small boxes plugged into outlets and placed at different spots around the room) contain spinning lasers whose beams sweep across sensors embedded in the headset. From the precise timing of when each laser hits each sensor, the PC figures out the headset's position and the angle you are facing.
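To make the timing idea concrete, here is a minimal sketch of the math, assuming each rotation starts with a sync flash at time zero and the rotor spins 60 times per second. The constants and names are illustrative, not drawn from Valve's actual code.

```python
import math

ROTOR_HZ = 60.0  # assumed rotation rate: one laser sweep across the room per spin

def sweep_angle(t_sync: float, t_hit: float) -> float:
    """Convert the delay between the base station's sync flash and the moment
    the laser line crosses a sensor into a sweep angle, in radians.

    Because the rotor spins at a known, constant rate, the elapsed time maps
    directly to how far the laser had swept when it struck the sensor.
    """
    return 2.0 * math.pi * ROTOR_HZ * (t_hit - t_sync)

# Example: a sensor struck 2.1 milliseconds after the sync flash
print(math.degrees(sweep_angle(t_sync=0.0, t_hit=0.0021)))  # ~45 degrees
```

Each base station sweeps the room both horizontally and vertically, so every pass yields two such angles per station.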
The PC then shows you the view you should be seeing, given where you are and which way you are facing. The lasers do not work like the Kinect cameras in Microsoft's Xbox One and Xbox 360 consoles. Those systems send out light signals and capture them as they bounce off objects and return to the camera, an approach that takes an enormous amount of processing power to track multiple objects and gets exponentially harder with each object you add.
Above: Getting ready to wear the headset
By contrast, the Valve system doesn't measure anything bouncing back. The sensors that are struck by the laser light send signals back to the PC over a wire (which is why, for now, the headset is wired). The PC then calculates where you are in the 3D space around you.
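As a toy illustration of what that calculation looks like, the sketch below turns each base station's pair of sweep angles into a ray and finds where two stations' rays for the same sensor nearly cross. The real system fuses many sensors plus inertial data into a full pose; none of these names or numbers come from Valve.

```python
import numpy as np

def ray_direction(azimuth: float, elevation: float) -> np.ndarray:
    """Unit direction of the ray from a base station toward a sensor,
    built from the station's horizontal and vertical sweep angles."""
    return np.array([
        np.cos(elevation) * np.sin(azimuth),
        np.sin(elevation),
        np.cos(elevation) * np.cos(azimuth),
    ])

def triangulate(p1, d1, p2, d2) -> np.ndarray:
    """Midpoint of the shortest segment between two (possibly skew) rays.
    The sensor sits where the two stations' rays nearly intersect."""
    w = p1 - p2
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ w, d2 @ w
    denom = a * c - b * b          # nonzero unless the rays are parallel
    s = (b * e - c * d) / denom
    t = (a * e - b * d) / denom
    return ((p1 + s * d1) + (p2 + t * d2)) / 2.0

# Two stations mounted high in opposite corners of a room, both
# reporting sweep angles for the same headset sensor:
p1, p2 = np.array([0.0, 2.0, 0.0]), np.array([4.0, 2.0, 4.0])
d1 = ray_direction(np.radians(30), np.radians(-20))
d2 = ray_direction(np.radians(-150), np.radians(-20))
print(triangulate(p1, d1, p2, d2))  # estimated sensor position, in meters
```

The key property is that each laser hit costs almost nothing to process (a timestamp and a little trigonometry), which is why the approach scales so well.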
“The way the system is designed is very performant,” said Barnett, in an interview with GamesBeat. “On the same PC, you can track hundreds of objects, all very easily.”
Each tracked object is also independent of the others. You can have four or five people wearing headsets in the same area, with just two base stations tracking all of them.
“You just want coverage of the room from every direction,” Barnett said.
The headset fit right over my glasses. I pulled back on the strap, put my face into the mask, and then lowered the strap around my head. I snapped on a belt to keep the wires in the right place behind me. There were about five wires, but that will be simplified to one wire in the final version.
Then Barnett put two wired controllers in my hands. They were like vertical sticks, with trackpad buttons on top, more buttons around the grip, and a forefinger trigger in the front. I pressed the trackpad button with my thumb. I squeezed the grip with my fingers to activate the grip buttons.
“This isn’t the final design we will ship, but it’s pretty close to what we will have for the dev kits that we will ship next month,” Barnett said. “The controllers that ship will be wireless, while these ones are obviously wired. When the controllers are wireless, the belt isn’t necessary.”
The last step in the preparations was to put an audio headset over my ears. The final version of the goggles will have headphones built in. The audio is 3D positional, so you can tell which direction sounds are coming from.
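The engine details aren't public, but the two strongest directional cues in any positional audio system are simple to sketch: the ear facing a sound hears it both louder and slightly sooner. Everything below is illustrative rather than Valve's implementation.

```python
import numpy as np

SPEED_OF_SOUND = 343.0   # meters per second
HEAD_RADIUS = 0.0875     # meters, roughly half the width of a head
SAMPLE_RATE = 44_100

def directional_cues(azimuth_rad: float) -> tuple[float, float, int]:
    """Return (left_gain, right_gain, far_ear_delay_samples) for a sound
    source at the given azimuth (0 = straight ahead, positive = right)."""
    # Woodworth's approximation of the interaural time difference:
    # the far ear hears the sound slightly later than the near ear
    itd = HEAD_RADIUS * (abs(azimuth_rad) + np.sin(abs(azimuth_rad))) / SPEED_OF_SOUND
    delay_samples = int(round(itd * SAMPLE_RATE))

    # Equal-power panning: louder in the ear facing the source
    pan = np.sin(azimuth_rad)            # -1 hard left .. +1 hard right
    left_gain = np.sqrt((1 - pan) / 2)
    right_gain = np.sqrt((1 + pan) / 2)
    return left_gain, right_gain, delay_samples

# A sound 60 degrees to the right: right ear louder, left ear ~22 samples late
print(directional_cues(np.radians(60)))
```

Real spatializers layer head-related transfer function (HRTF) filtering on top of these cues, so that height and front-versus-back direction come through as well.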
I stood up and was able to walk around. I looked down and saw blue lines on a grid at my feet. That was the safe area where I could walk. If I walked toward a wall, a blue grid of vertical lines appeared to warn me that I was about to hit it. If I walked away from the wall, the lines faded. It's a built-in safety mechanism.
“You don’t have to know where the walls are in your conscious mind,” Barnett said.
You can specify the safe area for any room in your home. Games created for SteamVR will recognize where the safe area is and configure the virtual environment to match it.
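Valve hasn't published how the fade works, but a simple distance-based ramp reproduces the behavior I saw. The sketch below assumes a rectangular play area centered on the origin; all names and thresholds are invented for illustration.

```python
def wall_grid_alpha(x: float, z: float, half_width: float, half_depth: float,
                    fade_start: float = 0.75) -> float:
    """Opacity of the warning grid, from 0.0 (invisible) to 1.0 (solid).

    The grid stays hidden until the headset comes within fade_start meters
    of the nearest wall, then ramps up as the wall gets closer.
    """
    # Distance from the headset to the closest edge of the rectangular bounds
    dist_to_wall = min(half_width - abs(x), half_depth - abs(z))
    if dist_to_wall >= fade_start:
        return 0.0
    return max(0.0, min(1.0, 1.0 - dist_to_wall / fade_start))

# A 4 m x 3 m room: no grid in the center, a strong grid near a wall
print(wall_grid_alpha(0.0, 0.0, half_width=2.0, half_depth=1.5))  # 0.0
print(wall_grid_alpha(1.8, 0.0, half_width=2.0, half_depth=1.5))  # ~0.73
```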
Hands-on with Valve’s VR demos
The demo started with The BluVR: Encounter, an underwater ocean simulator created by Wevr. I was standing on the deck of a sunken ship, and the deck matched the part of the floor where I could walk safely, a smart way to keep me from running into a wall or tripping over anything. I saw fish swimming in every direction, as well as manta rays, though the rays weren't animated perfectly: I saw ghosting (or double images) around them. Then I turned in another direction and saw a giant whale swimming next to me. It was huge, and it seemed even bigger because I was standing so close to it.
In the next demo, Job Simulator from Owlchemy Labs, I found myself in a kitchen, where a robot named Job Bot walked me through the prep. I had to read the recipe on the screen and then use the hand controllers to make tomato soup: pick up a couple of tomatoes and put them in a pot, grab some mushrooms and add those as well, shake in some salt, and then turn on the heat.
Then I saw a non-interactive tabletop wargaming simulation.
Steel Wool Games created a tabletop battle, dubbed Quar, where a bunch of creatures were attacking a fortress. I was planted right in the middle of the battlefield. I could watch the soldiers attack while defenders in trenches fired back with rifles. I saw the attackers slowly get hit, one by one. The visuals were pretty cool, and the battle unfolded before me. Eventually, the attackers on foot and mounted creatures were all killed, and the demo ended.
Next, I looked at Tilt Brush, a 3D sketching and painting demo from Skillman & Hackett. I used my hand controllers to pick up paintbrushes and start coloring the 3D space with neon lights. I could rotate the images to see them from any direction and scribble away. I made a funny happy face with a bunch of different colors. It was a lot of fun, though I didn’t create anything worth saving.
Google also created a 3D-animated version of Google Earth in VR. I saw the Matterhorn in all of its 3D glory, as well as downtown San Francisco and the stony walls of Bryce Canyon. It didn't have any interactions, but it paged in live data, including details on the mountains and how high they were. There was enough detail that I found VentureBeat's offices in San Francisco.
“This uses the live data set,” Barnett said. “If I unplug the Internet, this will disappear.”
Valve also created a demo of Aperture Laboratories, the fictional research lab from the Portal series of games. In the demo, GLaDOS, the homicidal artificial intelligence system, teased the test subject (me) and walked me through repairing a robot in comical fashion, eventually spouting the directions far too fast for any human to follow. I reached out and pressed buttons, and spun the components of the robot's exploded view around.
There were a few other apps, including a surgery simulation, a fishing demo, and one called Sky World. They weren't updated for the latest controllers, so I skipped them.
Why you don’t get seasick
The stereoscopic headset showed images to each eye at a rate of 90 frames per second, or 90 hertz. I felt like I was inside a 3D world, surrounded by blue ocean water. Keeping you from getting sick comes down to two things: a high frame rate and accurate tracking. The system locates you with submillimeter accuracy, thanks to the data coming in from the base stations and the sensors. It takes just 17 milliseconds to render a frame, far less time than is noticeable by the human eye. Gyroscopes in the headset measure your head's motion, which the system uses to predict your future head direction and the images you should see.
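The prediction step is easy to sketch in principle: integrate the gyroscope's angular velocity forward over the pipeline latency to estimate where your head will be pointing when the frame actually reaches the display. This shows the general technique, not Valve's specific code.

```python
import numpy as np

def predict_orientation(q: np.ndarray, omega: np.ndarray, latency_s: float) -> np.ndarray:
    """Predict the head orientation quaternion `latency_s` seconds ahead.

    q     -- current orientation as a unit quaternion [w, x, y, z]
    omega -- angular velocity from the gyroscope, in rad/s
    Assumes angular velocity stays constant over the short prediction window.
    """
    angle = np.linalg.norm(omega) * latency_s
    if angle < 1e-9:
        return q
    axis = omega / np.linalg.norm(omega)
    dq = np.concatenate([[np.cos(angle / 2)], np.sin(angle / 2) * axis])
    # Hamilton product q * dq: apply the predicted extra rotation
    w1, x1, y1, z1 = q
    w2, x2, y2, z2 = dq
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

# Head yawing at 90 degrees/second; predict 17 ms ahead so the frame lands on time
q_now = np.array([1.0, 0.0, 0.0, 0.0])        # facing straight ahead
omega = np.array([0.0, np.radians(90), 0.0])  # turning about the vertical axis
print(predict_orientation(q_now, omega, 0.017))
```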
“The system allows us to get extremely accurate information that is easy to process,” Barnett said. “The heavy lifting is in the rendering of pixels, which depends on how many pixels we are pushing into the display.”
The resolution is 1,200 x 1,080 pixels per eye. That requires an Nvidia GeForce GTX 980 graphics card, which in a few months will be a mid-range card. A phone (like Samsung's Galaxy Note 4 in the Samsung Gear VR headset) can't process this kind of data quickly enough yet. The organic light emitting diode (OLED) displays inside the goggles have to be "low persistence," meaning each frame is lit for only a brief instant, so you don't get dizzy from motion blur. You don't want objects to bend or smear when you turn your head. These displays have to have high refresh rates and display the pixels all at once.
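A little back-of-envelope arithmetic with those numbers shows why a phone falls short:

```python
# Pixels the GPU must render every second for this headset
width, height, eyes, hz = 1200, 1080, 2, 90
print(f"{width * height * eyes * hz / 1e6:.0f} million pixels per second")  # ~233

# For comparison, an ordinary 1080p game running at 60 frames per second
print(f"{1920 * 1080 * 60 / 1e6:.0f} million pixels per second")  # ~124
```

And that understates the load, since VR renderers typically draw each eye's view larger than the display before warping the image to compensate for the lenses.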
The lenses in the goggles account for the curvature of your eye. If you fixate on a single point and rotate your head, the world looks normal instead of warping as if you were underwater.
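Part of how headsets keep the world looking normal is by pre-distorting the rendered image so that the lens's own optical distortion cancels it out. A common way to model this is a radial polynomial, as in the sketch below; the coefficients are purely illustrative, since real values are measured per lens and shipped with the headset's calibration data.

```python
# Illustrative radial-distortion coefficients (real ones come from calibration)
K1, K2 = 0.22, 0.24

def predistort(u: float, v: float) -> tuple[float, float]:
    """Warp a normalized image coordinate (lens center at 0,0) outward so
    that the lens's pincushion distortion bends it back where it belongs."""
    r2 = u * u + v * v
    scale = 1.0 + K1 * r2 + K2 * r2 * r2
    return u * scale, v * scale

print(predistort(0.0, 0.0))  # the center of the lens is unchanged
print(predistort(0.5, 0.5))  # points near the edge are pushed outward
```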
“We found that tracking was really hard until you get a good solution,” Barnett said.
Conclusion
The result of all of Valve's work and attention to detail is a remarkably accurate, seasick-free VR system. Valve has been working on it for four years, and HTC expects to ship the first products in November 2015. Dev kits are shipping shortly, and that will lead to more apps.
We’ve certainly waited for a good experience in VR for a long time, and I think that Valve has finally created it. I’m looking forward to seeing the final hardware and more cool applications.
The leap from wired to wireless will be very difficult, as wireless links have limited bandwidth and the headset would have to run on battery power. With current technology, the battery might last just 30 minutes.
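The bandwidth half of the problem is easy to see with the same display numbers as before; an uncompressed feed would dwarf what 2015-era Wi-Fi can deliver.

```python
# Uncompressed video the wireless link would have to carry for this headset
width, height, eyes, hz, bits_per_pixel = 1200, 1080, 2, 90, 24
gbps = width * height * eyes * hz * bits_per_pixel / 1e9
print(f"{gbps:.1f} Gbps uncompressed")  # ~5.6 Gbps
```

Compressing the stream would shrink that figure, but compression adds latency, which is exactly the thing a VR headset can't afford.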
“With the pixels comes heat and battery power,” Barnett said. “As soon as you put a cooling system on the headset, like a fan, it makes noise and rattles, and that causes problems.”
Over time, batteries and radios may improve enough to run a wireless headset.
“Certainly not this year though,” Barnett said.