Sony is working on refining its Project Morpheus headset and the virtual reality applications that go with it. I had a good look at the demos at the recent Electronic Entertainment Expo (E3), the big gaming trade show in Los Angeles. Those demos tell us a lot about the progress the company is making toward the launch of the VR headset in 2016.
As you might expect, some parts of the demo were good and others were bad. Impulse Gear, a startup cofounded by MAG co-creator Seth Luisi, showed off a precise shooting demo that used a gun peripheral to steady your shots. That worked well, but I had a hard time with a mech demo dubbed Rigs, which left me nauseated. These demos show the progress Sony is making on the long march to VR nirvana. Tim Merel, managing director of Digi-Capital, forecasts that virtual reality will grow into a $30 billion market by 2020, while augmented reality will be even bigger at $120 billion.
My experience reinforced the notion that each VR game is unique. One game may turn out to be a great experience, while another has flaws that ruin it. Getting it right isn’t easy, and that may explain why it’s taking so long to get Morpheus to market.
Richard Marks, the director of Sony PlayStation’s Magic Labs research division, said in an interview with GamesBeat during the demo sessions that developers have to build up a lot of experience with VR to get it right. He said he is encouraged by the progress Sony is making, both with the hardware and with the array of games and demos available for the headset.
“It’s neat to see such a wide spread of ideas,” Marks said. “What’s surprising is how many different things are fun. There’s not just one thing you want to do.”
Marks said that Sony’s Morpheus headset now has a 5.7-inch (diagonal) organic light-emitting diode (OLED) display that can show high-definition images at a resolution of 1920 x 1080. It expands the field of view beyond last year’s demos, so that each eye now gets about 100 degrees of viewing. That gives you better peripheral vision.
The new display also enables low persistence, which removes the frames that would otherwise appear blurry while the user’s head is moving. The headset also refreshes the display at 120 frames per second, fast enough to keep up with your head movements.
By removing those blurry frames and reducing the lag between when the user moves and when the imagery updates, the low-persistence display is less likely to make the viewer seasick, a major problem with VR experiences.
The Morpheus headset will have a physical wire connecting you to the game console. You can wear any headphones by plugging them into a jack. It also works with the existing DualShock 4 and Move controllers for tracking, and it requires the standard PlayStation Camera for the PS4. The Move controllers and the camera help Morpheus detect where you are pointing your head and what you are doing with your hands.
Sony is still working on some other technologies. Marks has shown off eye-tracking demos before, but none of the Morpheus demos used that technology. Sony isn’t working on an augmented-reality technology to go with Morpheus either, in contrast to Microsoft’s HoloLens project. And if you want 3D positional audio, Sony will have to put that in a “breakout box.”
The first game Sony showed me was a scene from The Getaway, a crime game in which you’ve stolen a jewel and have to make off with it. You and your accomplice have to drive an escape van and fend off the attackers on motorcycles and in cars that are chasing you.
In this new level, called London Heist, an artificial intelligence companion drives the van, and he told me to shoot at our pursuers. I was holding Move controllers in my hands, and I had to load a gun by picking up clips from the dashboard. But I didn’t do that just by pressing a button, as you would in most games. Rather, I had to move my left hand to grab a clip and then bring it over to my gun hand to insert it into the gun. It took me a couple of tries to get right, as the Move controller isn’t that precise.
Once I got the hang of it, though, it became easier. I looked to my left and emptied my submachine gun at a motorcycle assassin, and then I had to shoot the gas tank of a pursuit car on my right. When I ran out of ammo, I had to reach over to the dashboard again, jam a clip into the gun, and then look to my left or right to spot more enemies. This kind of gameplay was a lot more immersive than shooting with a traditional PlayStation game controller.
“One thing that’s surprising to a lot of people is how much fun it is to do things with two hands separately,” Marks said. “Picking something up and doing something with the other hand to it—humans are really good with that. If you have to paint with just one hand and then move your body around it, it’s really hard, but if you can hold the object in the other hand and paint, it’s much more effective. You can move the paintbrush or move the object quickly.”
The problem with this demo was that the clip-loading technique took a lot more practice to get down in virtual reality than it would in real life. It also wasn’t easy to aim the gun at the right target; the enemies, accordingly, were quite incompetent. Using Move as a control system for VR is not yet as accurate as it should be.
Rigs: Mechanized Combat League
Rigs is a futuristic multiplayer battle arena game where one team of three mechs squares off against another team of three. It is a game that is designed as an esport, with a cheering crowd and lots of team regalia. There’s a big dramatic buildup before the match starts, much like in The Hunger Games film. The graphics look good, even when viewed in virtual reality.
You emerge on the three-level playing field in your big mechanized walker. You can start moving forward and jumping, all the while searching for the enemy. Once you locate a rival, you can shoot with lasers, rockets, or a plasma cannon. You can jump into the air to gain a height advantage, or just sneak up behind an enemy before blasting them.
Once you shoot enough enemies, you can pick up items and go on the offensive. You can move up the ramps or jump higher to get to the top level. Then you have to climb another ramp and jump into a hoop, like a basketball goal. But the enemies can shoot you down while you’re making the attempt to score. All of this happens in a matter of seconds, and it feels a lot like the action in a pro basketball game. Since there are only three players on each team, every player makes a difference.
The game is being built by Guerrilla Games’ Cambridge studio. Best known for the Killzone series, Guerrilla Games knows how to build shooters. I found it pretty easy to play and score points, either by shooting rivals or jumping through the hoop.
But it was a little difficult to use my head to steer the mech (your head is used as a form of control in addition to the controller). I often felt a mismatch between the motion caused by my head movement and the motion caused by the traditional controller in my hands. I’m not sure exactly what it was, but after I finished the demo, I felt queasy. If I had to guess, the game had too much fast action, and the developer didn’t pay enough attention to the mismatch between what my eyes were seeing on the screen and what my body was physically feeling.
Playroom VR
I had a good laugh playing this party game. It was an asymmetric game, where the other players looked at the TV and used traditional game controllers. Meanwhile, I was wearing the Morpheus VR headset. I couldn’t see the TV screen, but I had my own unique view of the action. I played a sea monster rising from the ocean depths. I emerged to see a lot of small people in a cartoon-like city in virtual reality. I shook my head to the left or right to knock down buildings with thunderous crashes. I moved along a predefined path through the city while the humanoid inhabitants fled before me.
When I reached the end of the city, I faced the other four players making their stand at the water’s edge. They started tossing things at me, and I had to dodge them and fire back. I took out a helicopter in the sky, but in the end their cute little humanoid characters overwhelmed me. It was a very noisy demo, but it proved the point that local multiplayer games can be a lot of fun, even if only one player has the Morpheus headset and everyone else on the couch plays on the TV screen. The title is in development at Sony’s Japan Studio.
Super Hypercube
You can think of this game as a form of Tetris in 3D. You have to manipulate a shape and get it into the right position before a timer runs out. In this Morpheus demo, you look straight ahead in VR and have to manipulate the 3D shape so that it fits through the hole ahead of you. It’s like getting a key into the right position before you slide it into a lock. It starts out easy, like fitting a cube into a cube-shaped hole. But then the shapes change, and you have to rotate them, peering around their sides in 3D to see which orientation will fit through the hole ahead of you.
The only trouble with this demo was that it was something you could play with a regular controller. You could get a little more insight into the visual puzzle by moving your head around, but everything happened so quickly that it was hard to act on the additional point of view that VR provides. The title is being developed by Kokoromi and will be published by Polytron as one of the titles for Morpheus.
Impulse Gear’s gun prototype
Greg Koreman and Seth Luisi of Impulse Gear showed me a prototype for a VR game that tries to solve the problem of precise shooting. It does so by pairing the Morpheus headset with a Move controller embedded in a plastic gun. The Move controller is tracked by the PlayStation Camera, and when I pulled the trigger on the physical gun, its counterpart in VR fired at my target.
Here’s the cool part that really hasn’t been done well before: I was able to point the gun in one direction and shoot. Then, at the same time, I was able to independently move my head in a different direction to see something else in the VR landscape.
The landscape was a sci-fi setting, where I landed on a planet and had to take on not only enemy humans but various monsters as well, including some very large spiders. I fired away using the laser gunsight. That helped me get close to my targets, but I always shot a little too high, so I adjusted and learned to lower my aim. Once I did, I was able to take out lots of enemies at both close and long range. But the prototype gun was missing crosshairs, which was the last thing I needed to get very precise with my shooting, since it was dark on the planet.
The Impulse Gear team created a 3D-printed prototype gun, based on existing Move peripherals, to house the Move controller. Marks said that the final gun design is still under review, but the results are promising so far. I felt it was the best shooting experience I’ve seen in VR to date, and I’m looking forward to seeing what kind of game Impulse Gear builds around the prototype.
“We really focused on the controls so that a first-person shooter can really work on Morpheus,” Luisi told me. “We think it can be done right.”