It has taken me a while to review all of the cool things I saw at the Electronic Entertainment Expo (E3), the big game trade show in Los Angeles last month. But I would be remiss if I didn’t point out the tech gear that impressed me most. I look at gaming from a tech point of view every year, and it always makes me feel good about the forward progress of both gadgetry and gameplay.
The consoles are certainly looking outstanding. The 3D animations of the newest blockbusters look spectacular. I’m particularly impressed with the realism of the human faces and bodies in upcoming games such as Sony’s Until Dawn, which is a horror game that debuts on the PlayStation 4 in August.
In Until Dawn, video game artists have created faces that look like the real thing. Supermassive Games uses motion capture and face capture technology from Cubic Motion that takes the acting performance of a real person and then overlays an animation on top of it.
The result is incredibly lifelike. The same goes for Naughty Dog’s depictions of characters like Nathan Drake, Sully, and Elena in Uncharted 4: A Thief’s End. Those high-quality animations are just so much more compelling than the previous generation of video games.
Michael Mumbauer, director of visual arts at Sony’s North American game division, and Mark Sagar, director of the Laboratory for Animate Technologies at the University of Auckland in New Zealand, gave an excellent talk on the subject at the most recent Game Developers Conference. They showed the first 15 minutes or so of Sony’s outstanding 2013 video game The Last of Us. That title is one of my favorites of all time, in part because that opening scene is so compelling and emotional. But Mumbauer and Sagar underscored the importance of realistic human animation by converting the scene to an 8-bit rendition with text-based dialogue. It just wasn’t compelling anymore. And that shows how important top-notch visual and voice quality are to a scene.
The awesome visuals in the upcoming games are going to set the bar very high for believability. The games I saw at E3 have reached a pinnacle of visual quality that I believe will keep gamers delighted during this generation of consoles. And it wasn’t just Sony that showed off this powerful imagery. So did Microsoft (Gears of War 4), Activision (Call of Duty: Black Ops III), Square Enix (Rise of the Tomb Raider), and others. I was even impressed with the graphics work that Electronic Arts did in creating realistic female bodies so that it could include women’s national team players in its upcoming FIFA 16 title.
Of course, there’s always a trade-off for realism. Sometimes, game makers use so much processing power to create a realistic face that they fail to create speedy movement to go with it. Bethesda’s Doom remake reminded me of this, as the first-person shooter was super fast, but it didn’t even try to render human faces in a realistic way.
The continuous improvement of human faces and body animations is why I keep coming back to new generations of existing franchises, even though they could otherwise be easily dismissed as unoriginal sequels. We have not yet reached the point of diminishing returns on 3D graphics for human faces.
But graphics technology isn’t the only thing that will keep the gears of gaming turning.
Virtual reality is going to start turning our heads next year, when Facebook’s Oculus VR division launches its Oculus Rift headset and its Oculus Touch input system.
Oculus made VR seem a lot more fun and relevant to gamers through its clever use of demos that showed how there are some games that you can play only in VR. The Oculus Touch has sensors that detect your finger movements as well as button pushes. I cracked up during the demo as I picked up a bunch of objects with my virtual hands and threw them rapid-fire at some passing carnival targets.
You can hold one Oculus Touch controller in each hand, enabling you to independently control your hands inside a virtual world. The Touch sensors aren’t completely precise, but they’re good enough for you to have a good time. I had fun picking up a Ping-Pong ball and whacking it with a paddle. I also punched a punching bag and shot a bunch of ceramic statues. It really made me feel like I was part of a virtual world.
Sony will also launch its Morpheus VR headset for the PlayStation 4. It also had some “only in VR” demos where you used your body movements or hand gestures (with the Move controller) to make things happen in the VR world. But I thought that Oculus showed superior demos and better input technology. Sony’s Morpheus demos will be limited by the power of the PlayStation 4, while the Oculus Rift is tied to the ever-advancing visual capabilities of the PC.
Valve is likely to be a dark horse as well, since it is supplying some excellent VR technology as the foundation for HTC’s Vive headset, coming this fall. Valve, however, didn’t show much at E3. In contrast, Oculus showed off a lot of games, such as a fun hockey title.
We’re a long way from identifying a winner in the VR wars. And there are plenty of other competitors besides the ones I’ve mentioned. But I am impressed with the continuous progress that Oculus VR in particular has shown in making better demos that are less and less likely to cause motion sickness. VR is going to be immersive, and it’s going to deliver unique gaming experiences that we won’t be able to get any other way. As long as developers remember that, I believe they’re going to be successful with VR.
Microsoft’s HoloLens was also an interesting newcomer at E3. I saw a couple of demos wearing the HoloLens augmented reality glasses. Like the Oculus Rift prototype I saw a couple of years ago, the HoloLens prototype was bulky and had a lot of wiring. When you put it on your head, you can still see the world around you, as the glasses are see-through. But you also see animated holographic images overlaid on the lenses.
When you first see this, it’s a magical experience. I put it on in a cube-shaped room and had to look for signs that bug-like alien creatures were breaking through the walls around me. I had to stay on my feet and constantly turn around to keep the enemies from sneaking up on me. As they crashed through the walls, I had to fire back at them. It was hard work, and I was sweaty by the end of it. For that short period of time, I really felt like I was fending off the aliens.
Then I used the HoloLens glasses to look at a Minecraft game. I was able to see the holographic image of a build from all angles. I could lift the top off the building and look down below it. I could also follow a character around and look at it from all points of view. I could really visualize how to make changes to the game scene and benefit from changes in perspective.
But after I took off the HoloLens headset, some of the magic wore off. I realized that the field of view was very limited. You can only see the animated creatures in a relatively small, concentrated space in front of your eyes. If you look to the right or left, you see the real world, not an animation. This limited field of view is a big restriction for HoloLens right now. That means game designers are going to have to craft experiences that take this limitation into account.
I still haven’t seen Magic Leap, which is making augmented-reality goggles. Its ambition is to create computer-animated objects that are indistinguishable from real-life objects when viewed through its glasses. And I figure Google is doing more with this kind of technology than simply investing in Magic Leap.
Tim Merel, managing director of Digi-Capital, a game market advisory firm, believes that AR will account for the bulk of a combined $150 billion VR/AR market by 2020. But when it comes to gaming, I think VR is going to be much more immersive.
Shinra Technologies and Improbable have also shown us what can happen when you apply supercomputing to games. Those companies are creating simulation technologies that will allow even small indie game companies to build massive simulated worlds, with endless numbers of objects and landscapes.
While many of these technologies are expensive, I really admired one that makes game development dramatically more affordable. Hello Games has a team of just 10 people, but it showed off No Man’s Sky, a galaxy exploration game with an endless amount of detail to explore. Instead of hiring an army of artists to create new planets, spaceships, and aliens, Hello Games created algorithms that generate new environments procedurally, with new creatures assembled automatically from general instructions supplied by artists. Hello Games has figured out a way to make an endless amount of content without spending a ton of money on an endless number of artists and programmers.
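To give a rough sense of how seed-based procedural generation works, here is a minimal Python sketch. It is an illustration under my own assumptions, not Hello Games’ actual technology: the terrain and creature palettes and the generate_planet helper are hypothetical stand-ins for the “general instructions” artists would author. The key idea is that the same seed always produces the same planet, so a galaxy of worlds never has to be stored anywhere.

```python
import random

# Toy, illustrative sketch of seeded procedural generation -- not Hello Games'
# actual algorithms. The palettes below stand in for artist-authored rules,
# and generate_planet is a hypothetical helper.
TERRAINS = ["ocean", "desert", "jungle", "tundra", "volcanic"]
BODY_PLANS = ["quadruped", "biped", "winged", "armored", "aquatic"]

def generate_planet(galaxy_seed: int, planet_id: int) -> dict:
    """Derive a planet deterministically from a seed, so no content is stored on disk."""
    rng = random.Random(galaxy_seed * 1_000_003 + planet_id)  # same inputs -> same planet
    return {
        "terrain": rng.choice(TERRAINS),
        "gravity": round(rng.uniform(0.3, 2.5), 2),  # relative to Earth = 1.0
        "creatures": [rng.choice(BODY_PLANS) for _ in range(rng.randint(0, 5))],
    }

# Every player who visits planet 42 in a galaxy seeded with 2015 sees the same world.
print(generate_planet(2015, 42))
```

Because the content is recomputed from the seed on demand, a tiny team can ship a practically unlimited universe without hand-building or storing each world.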
All of these cool technologies mean that gaming is going to benefit broadly from improvements in technology on all fronts, including graphics, VR, input systems, and server technology.
I also see some cool mobile technology coming down the road, but I can’t yet talk about it. When it arrives, you’ll be the first I tell.