The 2014 International CES offered plenty of bright futures for gamers. Valve’s Steam Machines and the Oculus Rift’s new virtual reality prototype captured much of the attention.
But the aisles were also full of cool game tech. A large gaming showcase in the South Hall of the cavernous Las Vegas Convention Center brimmed with small innovations in game technology. I’ll use this column to describe some of the lesser-known ones for those of you who didn’t have a chance to brave the crowds of 150,000 people and the 3,200 exhibits covering 2 million square feet of space.
[aditude-amp id="flyingcarpet" targeting='{"env":"staging","page_type":"article","post_id":884383,"post_type":"story","post_chan":"none","tags":null,"ai":false,"category":"none","all_categories":"games,","session":"D"}']Hopefully, one of these things will take off, as the game industry is definitely ripe for disruption. The $24 billion U.S. game business grew only slightly in 2013. New consoles and better mobile and online games will breathe life into that business. Worldwide, the $70 billion game business is expected to grow to $100 billion by 2017, according to Digi-Capital. But to make that happen, the industry will have to deliver much better new experiences to wider audiences of gamers.
Here’s a snapshot of some of the things that I saw at the show. Let’s hope they’ll spark a gaming revolution and introduce a greater variety of gaming technology into our lives than the next-generation consoles have done so far.
Oxide’s Star Swarm demo
Advanced Micro Devices showed off its new processors, code-named Kaveri, which will power the next generation of gaming PCs. Those chips support Mantle, a new application programming interface that allows developers to better exploit the performance of a PC’s graphics hardware. Oxide took Mantle and ran with it. The Hunt Valley, Md.-based game studio has created a next-generation game engine, Nitrous, that takes advantage of Mantle to prioritize graphics tasks.
The result was a stunning demo called Star Swarm, which reminded me of the huge space battles from the movie Star Wars: Return of the Jedi and the game Homeworld. But this demo had anywhere from 3,000 to 5,000 starships fighting it out in a gigantic battle, according to Tim Kipp, a co-founder of Oxide Games. Not bad for a small team founded last year by four game developers with funding from Stardock.
“We really wanted to see what we could do in pushing real-time strategy games forward,” said Kipp in an interview with GamesBeat.
Star Swarm uses the Mantle API to make more efficient use of graphics processors so that it can put many more units on the screen at the same time. Oxide is making its own game based on the engine, and it has two other licensees working on games as well. It hopes the new engine will lead to a new generation of strategy games. It will release a version of Star Swarm for modders in the first quarter.
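Here’s a rough sketch of the idea behind that efficiency, under stated assumptions: the real Mantle API is a set of C functions that I won’t reproduce here, but the core trick is letting several CPU threads record cheap draw commands into command buffers and hand them to the graphics card in one batch, rather than funneling every draw through a heavily validated driver call. All the names below are hypothetical.

```python
# Illustrative sketch only -- not the real Mantle API, whose calls are C
# functions with different names. The point: recording a draw is a cheap
# append with no per-call driver validation, so several threads can record
# thousands of unit draws in parallel and submit them to the GPU in one go.
from concurrent.futures import ThreadPoolExecutor

class CommandBuffer:
    """Hypothetical stand-in for a Mantle-style command buffer."""
    def __init__(self):
        self.commands = []

    def record_draw(self, unit):
        self.commands.append(("draw", unit.mesh, unit.transform))

def record_chunk(units):
    buf = CommandBuffer()
    for unit in units:
        buf.record_draw(unit)
    return buf

def render_frame(units, gpu_queue, workers=4):
    # Split 3,000 to 5,000 ships across threads; each records independently.
    chunks = [units[i::workers] for i in range(workers)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        buffers = list(pool.map(record_chunk, chunks))
    gpu_queue.submit(buffers)  # one batched handoff to the graphics card
```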
Anki Drive with a software upgrade
Boris Sofman, chief executive of Anki, gave us a personal demo of the newest upgrades for the Anki Drive racing game, where humans try to beat artificial-intelligence-based cars in combat racing matches.
The company launched its iPhone-controlled racing game in October, and it has since released a major software upgrade. With Anki Drive, you use your iPhone to control a race car in competition with other artificial-intelligence-driven cars speeding around a physical track. The system detects each car’s location and movement 500 times a second.
[aditude-amp id="medium1" targeting='{"env":"staging","page_type":"article","post_id":884383,"post_type":"story","post_chan":"none","tags":null,"ai":false,"category":"none","all_categories":"games,","session":"D"}']
The track mat is printed with an ink that embeds a code, which tells each car where it is on the track and where it is relative to other cars. Each car also has a tiny camera underneath that reads that information. (Yes, this is why the game costs $200.) The cars have a 50-megahertz processor, a multicolor LED, and a Bluetooth low-energy radio to communicate with the smartphone.
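Anki hasn’t published how its firmware works, so the sketch below is purely hypothetical, but it shows the general shape of such a sensing loop: decode the position code from the camera 500 times a second, report the pose over Bluetooth low energy, and correct the steering.

```python
# Hypothetical sketch of a car's sensing loop -- Anki has not published its
# firmware, so every name here is an assumption. The idea: a downward-facing
# camera decodes a position code printed in the track's ink 500 times a
# second, and the car reports its pose over Bluetooth low energy.
import time
from dataclasses import dataclass
from typing import Optional

READ_HZ = 500  # the track is sampled 500 times per second

@dataclass
class TrackCode:
    lane: float                  # lateral position across the track
    distance_along_track: float  # how far around the circuit the car is

def decode_track_code(frame) -> Optional[TrackCode]:
    """Assumed decoder for the code embedded in the track's ink."""
    ...  # image processing omitted; returns None if no code is visible

def sensing_loop(camera, radio, motor):
    period = 1.0 / READ_HZ
    while True:
        code = decode_track_code(camera.capture())
        if code is not None:
            radio.send_pose(code)          # tell the phone where we are
            motor.steer_toward(code.lane)  # hold the commanded lane
        time.sleep(period)
```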
Sofman showed off a feature known as Reverse Drive, which lets you do a 180-degree turn and change directions. You can fire a weapon to take someone out or just drive against the flow of traffic. A car will light up and slow down if it gets hit with a weapon, and the user’s iPhone screen shows what happened.
There’s also a new Kinetic Brake, an emergency brake that brings you to a dead stop on the track. An electromagnetic pulse lets you damage opponents as you pass them. And there’s a horn to force others out of your way.
Intel’s Hoplite gesture-controlled game
The RealSense 3D depth camera from Intel packs a lot of technology into a small webcam. It still needs some refinement, but Intel executive Mooly Eden showed me a game called Hoplite, built by Intel’s internal developers as a demo. You use hand and finger gestures to create bridges so that the little hoplite soldiers can cross chasms and other obstacles. If you drop them, they fall into the burning lava below.
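I don’t have the demo’s source, and I won’t guess at the RealSense SDK’s actual calls, but the game-side logic of a bridge mechanic like that could look something like this sketch, assuming the camera hands you a fingertip position and a pinch gesture each frame:

```python
# Sketch of the game-side logic, assuming a depth-camera SDK that reports a
# fingertip position and a pinch gesture each frame (the real Intel SDK
# calls are omitted). While the player pinches, the traced path becomes a
# bridge the hoplites can walk across.

class BridgeBuilder:
    def __init__(self, segment_length=0.5):
        self.segment_length = segment_length
        self.segments = []       # world-space points forming the bridge
        self._last_point = None

    def update(self, fingertip, pinching):
        if not pinching:
            self._last_point = None  # lifting the pinch ends the bridge
            return
        if (self._last_point is None
                or distance(fingertip, self._last_point) >= self.segment_length):
            self.segments.append(fingertip)  # drop a new plank
            self._last_point = fingertip

def distance(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
```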
[aditude-amp id="medium2" targeting='{"env":"staging","page_type":"article","post_id":884383,"post_type":"story","post_chan":"none","tags":null,"ai":false,"category":"none","all_categories":"games,","session":"D"}']
It may not seem like much, but it’s a start toward close-range gesture-controlled gaming that you can do at a PC. It is the first of what Eden calls “perceptual computing” input devices that will change the user interface of computing. And it put a smile on my face. I know because Eden took a picture of me while I was playing it, and it showed me grinning like a fool.
Nvidia’s Unreal Engine 4 demo on Tegra K1
Mobile games don’t have to suffer a quality gap with console and PC games anymore, thanks to the newest mobile processor from chip maker Nvidia. The Tegra K1 is a super chip with 192 graphics cores, compared with just 72 on last year’s Tegra 4.
Nvidia senior vice president Tony Tamasi told us that means it can effortlessly run games based on Unreal Engine 4, as you can see in this video demo from the graphics chip maker’s CES press conference. It can show special effects like global illumination (where light from a single source, such as the sun, bounces realistically through a game world, casting light and shadows), high dynamic range for bright and dark details in the same frame, water and smoke effects, and human faces that look like real people.
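To make one of those effects concrete: high dynamic range means rendering in floating point, where a sunlit highlight can be many times brighter than white, and then compressing the result to the display’s range. Here’s a minimal sketch using the classic Reinhard tone-mapping operator, a standard technique and not necessarily what the demo used:

```python
# A minimal sketch of HDR tone mapping with the classic Reinhard operator --
# a standard technique, not necessarily what the Unreal Engine 4 demo used.
# The scene is rendered in floating point (values can exceed 1.0 for bright
# areas), then compressed so bright and dark detail both fit the display.

def reinhard_tonemap(hdr_pixel, exposure=1.0):
    """Map an HDR RGB value (floats, possibly > 1.0) into [0, 1)."""
    return tuple((exposure * c) / (1.0 + exposure * c) for c in hdr_pixel)

# A sunlit highlight and a shadow detail both survive the mapping:
print(reinhard_tonemap((8.0, 6.0, 4.0)))     # bright: compressed below 1.0
print(reinhard_tonemap((0.02, 0.02, 0.03)))  # dark: nearly unchanged
```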
The colors look vibrant, the details are rich, and the shadows are incredible for a mobile game demo. It would make sense for Nvidia to put the Tegra K1 in a Shield 2 portable gaming device, but Nvidia hasn’t announced it yet. The Tegra K1 will power some 4K UltraHD tablet games that will be truly stunning, even next to your newest game console.
[aditude-amp id="medium3" targeting='{"env":"staging","page_type":"article","post_id":884383,"post_type":"story","post_chan":"none","tags":null,"ai":false,"category":"none","all_categories":"games,","session":"D"}']
The Avegant Glyph
This headset shines images directly on your retinas and pumps sound into your ears at the same time. The Avegant Glyph is a virtual retinal display that gives you the experience of watching an 80-inch TV. Except there is no TV: the Glyph uses optics to project moving images straight onto your eyes.
The Glyph is the latest in a string of innovative human-machine interfaces that promise to turn traditional gaming upside down. It is an alternative vision to Oculus VR’s 360-degree virtual reality, as the first generation of the Glyph has a 40-degree field of view and a 720p video image. Over time, though, the gear will get less cumbersome, and the visual quality will become more stunning. It is already immersive, and it will probably cost less than an 80-inch TV.
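That 80-inch comparison checks out as a back-of-the-envelope calculation, assuming the 40-degree figure is the horizontal field of view and a 16:9 screen (both are my assumptions, not Avegant’s specs):

```python
# Back-of-the-envelope check of the "80-inch TV" claim, assuming the
# 40-degree figure is the horizontal field of view and a 16:9 aspect ratio
# (both assumptions, not published Avegant specs).
import math

diagonal_in = 80.0
fov_deg = 40.0

width_in = diagonal_in * 16 / math.hypot(16, 9)  # ~69.7 inches wide
distance_in = (width_in / 2) / math.tan(math.radians(fov_deg / 2))

print(f"{distance_in / 12:.1f} ft")  # ~8 ft: a normal living-room distance
```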
Eye-tracking control system for games from Tobii and SteelSeries
Carl Korobkin of Tobii showed me how his company’s eye-tracking control system works when it comes to controlling games.
Stockholm-based Tobii has been developing its eye-tracking technology for more than a decade, and it’s now ready to partner with gaming peripheral maker SteelSeries on a version for mass-market applications such as gaming. They promise precision control in games, where you use your eyes to target enemies and fire much more quickly than you could with a controller in your hands.
[aditude-amp id="medium4" targeting='{"env":"staging","page_type":"article","post_id":884383,"post_type":"story","post_chan":"none","tags":null,"ai":false,"category":"none","all_categories":"games,","session":"D"}']
Tobii’s technology uses a webcam-like imaging device to detect your eyeballs in three dimensions and then determine precisely where you are looking on a screen. It was a little rough around the edges at CES. You stare into the device so that it can calibrate to your eyes and track them. Once it does that, you can play a variety of demos, like using your eyes to tell your World of Warcraft character where to move. I used it to play StarCraft II, using my eyes to target where I wanted my squads of Terran soldiers to move.
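I won’t guess at Tobii’s actual SDK calls, but here’s a hedged sketch of how a game might turn that gaze stream into a move order; the smoothing-and-dwell approach is my assumption, since raw gaze samples jitter too much to use directly.

```python
# Sketch of turning gaze data into a move order, assuming an eye tracker
# that streams (x, y) screen coordinates (Tobii's real SDK calls are not
# shown). Raw gaze jitters, so recent samples are averaged, and a command
# fires only after the gaze dwells in one spot briefly.
from collections import deque

class GazeCommander:
    def __init__(self, window=30, dwell_radius=40):
        self.samples = deque(maxlen=window)  # recent gaze points in pixels
        self.dwell_radius = dwell_radius

    def on_gaze_sample(self, x, y, issue_move):
        self.samples.append((x, y))
        if len(self.samples) < self.samples.maxlen:
            return
        cx = sum(p[0] for p in self.samples) / len(self.samples)
        cy = sum(p[1] for p in self.samples) / len(self.samples)
        # If every recent sample sits near the average, the gaze is dwelling.
        if all((px - cx) ** 2 + (py - cy) ** 2 <= self.dwell_radius ** 2
               for px, py in self.samples):
            issue_move(cx, cy)  # e.g., send the Terran squad to this point
            self.samples.clear()
```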
Korobkin showed another demo of a soccer game, where I was shooting penalty kicks at a goalie. If I looked left, the goalie would anticipate my move and dive to the left, blocking my kick. But I could also fool him by looking to the right and kicking to the left. That gameplay was based on the eye-tracking software looking into my eyes to judge my intentions. And that was pretty cool. The technology should get better over time, and I think it will be a very interesting user interface for games of the future.