Oscar Werner, president of Tobii Tech, gave me a rundown of the company’s eye-tracking technology for games at CES 2017, the big tech trade show in Las Vegas earlier this month. And I’m finally getting around to relating that experience. With those demos, I’ve got a better idea about whether this will be the next great user interface for games.

Stockholm-based Tobii’s platform tracks your eye movements and uses them to control a computer.

Above: Oscar Werner, president of Tobii Tech.

Image Credit: Tobii

I’m still figuring out just how revolutionary this will be. On the one hand, it’s very cool to use your eyes, and it’s faster too. On the other hand, it is something you have to learn to do. And as I learned in my interview with Synaptics CEO Rick Bergman, it can be very hard to teach humans how to use something new. But it’s unquestionable that Tobii and eye tracking are gathering momentum.

The sensors have been integrated into a variety of laptops and displays, and around 50 titles now take advantage of the eye-tracking controls, Werner said in an interview with GamesBeat.

“Going forward, devices have to understand who you are, what you are doing, and where you are looking,” Werner said.

The technology can authenticate people through iris recognition, which requires the computer to know where your iris is and what your eye looks like. It can also help you control a game faster than an opponent who is using conventional controls.

Among the laptops that use the technology is the Acer Aspire V 17 Nitro Black Edition, which debuted at CES. The Alienware 17 gaming laptop also has integrated Tobii Aware software and so does the MSI GT72 6QE Dominator Pro G laptop. Acer will also launch a curved monitor, the Acer Predator Z1, with Tobii built into it.

Above: The Aspire V 17 Nitro is one of the first laptops with Tobii’s eye-tracking technology built in.

Eye tracking with Dying Light

Above: Dying Light’s zombies-and-parkour mixture attracted a lot of gamers in January.

Image Credit: Techland

Werner showed me how eye tracking works with Dying Light, Techland’s hit zombie-killing game that debuted in 2015. I only had a few minutes to learn what to do and form my impressions. But it worked.

With Dying Light, you have to deal with a lot of zombies coming at you at once. But eye tracking, through features dubbed Aim@Gaze and Throw@Gaze, gives you a way to react quickly. If you are facing one direction and a zombie comes at you from the side, you can press a button and hold it. You then look at the zombie with your eyes, without changing the way your character is facing. Once you have locked onto the zombie, you release the button and throw a knife at it. That is Throw@Gaze.

You can also do something called MultiThrow@Gaze. In this case, you mark several zombies with your eyes by looking at them one by one while holding the button. When you release it, you throw a knife at each marked zombie simultaneously, taking out several targets at once. I got the hang of this quickly.
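
To make the mechanic concrete, here is a minimal sketch of how a game might wire gaze data into this kind of target marking. The get_gaze_point, find_zombie_near, and throw_knife_at functions are stand-ins I made up for the example, not the actual Tobii or Techland APIs.

    # Hypothetical sketch of Throw@Gaze / MultiThrow@Gaze-style target marking.
    # The function arguments stand in for the real eye-tracker and game-world
    # calls, which are not shown here.
    marked_targets = []

    def on_throw_button_held(get_gaze_point, find_zombie_near):
        """While the throw button is held, mark whatever zombie the player looks at."""
        gaze_x, gaze_y = get_gaze_point()          # current gaze position on screen
        zombie = find_zombie_near(gaze_x, gaze_y)  # hit test around the gaze point
        if zombie is not None and zombie not in marked_targets:
            marked_targets.append(zombie)          # several marks at once = MultiThrow@Gaze

    def on_throw_button_released(throw_knife_at):
        """On release, throw a knife at every marked zombie, then clear the list."""
        for zombie in marked_targets:
            throw_knife_at(zombie)
        marked_targets.clear()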

My conclusion is that you can shoot and target faster using Aim@Gaze eye tracking compared to using a computer mouse, just as using a mouse is faster than using a console game controller. It’s such an unfair advantage that Tobii in many cases doesn’t let you precisely target something. Rather, you get close to the target by using your eyes, and then you have to use the mouse or controller to finish the precise targeting. That’s a little more fair, both to mouse users and to the targets themselves.
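
One plausible way to build that kind of deliberately limited assist is to pull the reticle only part of the way toward the gaze point and leave the final correction to the mouse. This sketch is my own illustration; the blend factor and distance threshold are invented tuning values, not Tobii’s.

    # Illustrative gaze-assisted coarse aiming: the reticle jumps most of the way
    # toward the gaze point, but precise targeting stays with the mouse.
    def assist_reticle(reticle_x, reticle_y, gaze_x, gaze_y, blend=0.8, min_jump=50.0):
        """Return a new reticle position pulled toward the gaze point.

        Only a coarse correction is applied, so the player still finishes
        the aim with the mouse or controller.
        """
        dx = gaze_x - reticle_x
        dy = gaze_y - reticle_y
        distance = (dx * dx + dy * dy) ** 0.5
        if distance < min_jump:
            return reticle_x, reticle_y  # already close: let the mouse do the fine work
        return reticle_x + dx * blend, reticle_y + dy * blend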

I enjoyed throwing knives. That worked wonderfully well. But throwing knives is a very small part of that game, and I’m not sure how often I would use that trick in gameplay. Eye tracking also enables other shortcuts, such as speeding up how quickly you can shift your body while running.

If Tobii has a chance to build an extended tutorial into a game or software program, it will probably become a lot easier to learn.

“I agree, and we are working on a couple of games where we do that,” Werner said.

Eye controls for Watch Dogs 2

Above: I’ve walked across this bridge, depicted in Watch Dogs 2, in real life.

Image Credit: Ubisoft

With Ubisoft’s Watch Dogs 2 hacking game, I was able to try what Werner called Hack@Gaze. I used a game controller to drive a car in the game at high speed, either chasing someone or being chased by the police. As you pass other objects, people, or cars, you can click on them in the game and hack them in an instant. In a car chase, you can hack a traffic signal and cause a wreck behind you. Or you can hack a car ahead of you and make it run off the road.

With eye tracking, I was able to glance at other objects and target them using my eyes, rather than the cursor controlled by the game controller. I could drive straight and look to the side of the screen and hack that object quickly without losing my focus on driving. That was the theory, anyway.

In my short session, I didn’t have much time to learn it, and I had trouble with the one-two coordination of targeting with my eyes and then executing the hack with a controller button. You can also use eye tracking to get your aim near an object you want to shoot: the targeting reticle jumps close to it in a split second, which helps you shoot faster and more accurately, though not so accurately that you hit something every single time.
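
Here is a rough sketch of how a Hack@Gaze-style selection could work: whichever hackable object sits closest to the gaze point becomes the target, and a button press triggers the hack. The dictionaries and distance threshold are placeholders of my own, not Ubisoft’s or Tobii’s code.

    # Sketch of Hack@Gaze-style selection with made-up data structures.
    def pick_hack_target(hackable_objects, gaze_x, gaze_y, max_radius=120.0):
        """Return the hackable object nearest the gaze point, or None if nothing is close."""
        best, best_dist = None, max_radius
        for obj in hackable_objects:
            dist = ((obj["x"] - gaze_x) ** 2 + (obj["y"] - gaze_y) ** 2) ** 0.5
            if dist < best_dist:
                best, best_dist = obj, dist
        return best

    def on_hack_button(hackable_objects, gaze_x, gaze_y):
        """Hack whatever the player is looking at, e.g., a traffic signal or a car."""
        target = pick_hack_target(hackable_objects, gaze_x, gaze_y)
        if target is not None:
            target["hacked"] = True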

“I think it’s more natural to aim with your eyes,” Werner said. “In computer games, with a mouse or controller, you look at something, then you turn your character and aim. Then you can shoot. That’s not as natural.”

Ubisoft also integrated the eye controls into Steep, its winter sports game that launched in December. With implementations like “Extended View” and “Clean UI,” players can see more of the screen at any given time with fewer on-screen distractions.

With Clean UI, for instance, you can look at a part of the screen where there’s a user interface. That part of the screen will disappear and instead show you the part of the environment that would otherwise be obstructed by that interface.
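
A simplified version of that behavior, as described above, might just ease a HUD panel’s opacity toward zero whenever the gaze point sits inside its rectangle. The layout tuple and fade speed here are invented for the example.

    # Clean UI-style fade: the panel the player looks at becomes transparent,
    # revealing the scene behind it, and fades back in when the gaze moves away.
    def update_hud_alpha(alpha, panel_rect, gaze_x, gaze_y, dt, fade_speed=4.0):
        """Ease a panel's opacity toward 0 while gazed at, and back toward 1 otherwise."""
        left, top, width, height = panel_rect
        gazed_at = left <= gaze_x <= left + width and top <= gaze_y <= top + height
        target = 0.0 if gazed_at else 1.0
        step = fade_speed * dt
        if alpha < target:
            return min(alpha + step, target)
        return max(alpha - step, target)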

Productivity eye tracking

Above: Tobii is built into the latest Acer gaming laptop.

Image Credit: Dean Takahashi

Not only does Tobii give you more intuitive control of a game; it also enables a smoother workflow in productivity apps for Windows 10.

On the productivity side, Tobii thinks of eye tracking as a “virtual touchscreen,” allowing you to look at something and jump to that spot on the screen. It takes some getting used to. You have to train the machine to see your eyes with a quick calibration program. Once calibrated, the machine will pretty much recognize you every time, Werner said.

To activate the eye tracking in Windows, you place a finger on the laptop’s touchpad and stare at a spot on the screen. When you click, the cursor moves to that spot. It’s a lot faster than moving the mouse or swiping the touchpad with a finger. Tobii calls this Touch@Gaze.

You can also do Scroll@Gaze. When you have multiple windows open on the screen, you can swipe two fingers on the touchpad and then look at a window. When you do that, you’ll scroll in the window you’re looking at. There’s also Zoom@Gaze, where you look at an area and pinch on the touchpad. Then, you’ll zoom in on the area that you are looking at.

“It takes a little getting used to,” Werner said. “But the computer knows your intentions a lot better than other user interfaces do.”

You can also use eye tracking to switch apps. If you hold the Alt and Tab keys, you’ll see all of the open programs on the screen. Glance at one, release the keys, and you’ll be taken to that program.

“It’s bringing the interaction to the point where you use your eyes to do the touching,” Werner said.
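
Taken together, these gestures amount to routing each touchpad event to whatever the eyes are pointing at. The sketch below shows that pattern with a hypothetical desktop object; the event names and helper methods are mine, not the Windows or Tobii APIs.

    # Gaze-plus-touchpad dispatch, with invented event names and desktop helpers.
    def handle_gesture(event, gaze_x, gaze_y, desktop):
        if event == "tap":                          # Touch@Gaze: warp the cursor
            desktop.move_cursor(gaze_x, gaze_y)
        elif event == "two_finger_swipe":           # Scroll@Gaze: scroll the gazed-at window
            window = desktop.window_at(gaze_x, gaze_y)
            if window is not None:
                window.scroll()
        elif event == "pinch":                      # Zoom@Gaze: zoom where the eyes are
            desktop.zoom_at(gaze_x, gaze_y)
        elif event == "alt_tab_release":            # app switching: pick the gazed-at thumbnail
            thumbnail = desktop.thumbnail_at(gaze_x, gaze_y)
            if thumbnail is not None:
                desktop.switch_to(thumbnail)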

Werner said that several new eye-tracking-enhanced games are being added monthly. Tobii expects 100 such titles to be available by the end of 2017, including several upcoming triple-A offerings.

Huawei has also created its Honor Magic smartphone with eye tracking built into it. I didn’t get to try that one. That phone, available only in China, uses Tobii to acquire data on the person’s presence and attention.

Above: Tobii Eye Tracker 4C, Tobii’s second-generation gaming peripheral. (Photo: Business Wire.)

The Aspire V 17 Nitro will be available in February. Tobii also sells stand-alone accessories that can retrofit existing PC gaming hardware with eye-tracking capability.

Tobii will likely have some big competitors soon. Facebook acquired The Eye Tribe for use with its Oculus Rift virtual reality headset. And Google acquired rival eye-tracking firm Eyefluence.

Eye tracking for virtual reality

At the Game Developers Conference in late February, Tobii and its customers will also show off eye tracking for VR headsets, Werner said.

With VR, eye tracking has a particularly useful purpose. Cameras built into the VR headsets will be able to detect where you are looking on a screen. Japan-based Fove recently opened preorders for a VR headset that lets you play games by moving your eyes.

Above: You can control this shooter reticle with your eyes using Fove.

Image Credit: Fove

Eye tracking is critical to a technique called foveated rendering. With it, the screen fully renders the area your eye is looking at, but it skips the details in your peripheral vision that your eye can’t resolve anyway.

This technique can save an enormous amount of graphics processing power. (Nvidia estimates foveated rendering can cut the graphics workload by as much as a factor of three.) That is useful in VR because it takes a lot of processing power to render VR images for both of your eyes. VR should render at 90 frames per second for each eye in order to avoid making the player dizzy or sick. And foveated rendering — which will likely prove necessary for stand-alone VR headsets — doesn’t really work unless you know exactly where the person is gazing.
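
As a toy illustration of the idea, a renderer could pick a shading level for each screen region based on its distance from the gaze point. The thresholds below are made up; real foveated pipelines are far more sophisticated.

    # Toy foveated-rendering heuristic: full detail at the gaze point, less in the periphery.
    def shading_level(region_x, region_y, gaze_x, gaze_y):
        """Return a relative shading resolution for a screen region (1.0 = full detail)."""
        dist = ((region_x - gaze_x) ** 2 + (region_y - gaze_y) ** 2) ** 0.5
        if dist < 200:       # foveal region: render everything
            return 1.0
        if dist < 600:       # near periphery: roughly half the shading work
            return 0.5
        return 0.25          # far periphery: detail the eye can't resolve anyway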

“This year, we are going to work very closely with foveated rendering in VR and on the PC,” Werner said.
