Sony has received a lot of attention for Project Morpheus, the virtual-reality goggles it unveiled in March. But an equally cool demo from the same tech-development lab is eye-tracking, which lets you control where you’re aiming in a video game with your eyes. Simply by looking at something, you can pinpoint a reticle on a target without ever touching a game controller.
At this year’s Game Developers Conference and the recent NeuroGaming conference, Sony demonstrated how the eye-tracking works with Infamous: Second Son, a popular open-world superhero game for the PlayStation 4 that it had modified for the eye-tracking input system. I played around in a limited environment of the game as Delsin Rowe, the protagonist. When I looked around the game world with my eyes, the camera followed.
[aditude-amp id="flyingcarpet" targeting='{"env":"staging","page_type":"article","post_id":1480710,"post_type":"story","post_chan":"none","tags":null,"ai":false,"category":"none","all_categories":"business,games,mobile,","session":"C"}']“If you look at different things in the environment, the camera moves to where you are looking,” said Eric Larsen, an engineer at Sony Computer Entertainment America’s Magic Lab research-and-development unit, in an interview with GamesBeat. “It’s becomes like a passive activity. You don’t have to consciously direct your eyes. You just look at different things.”
The movement didn’t make me dizzy or disoriented. It felt natural, and it didn’t tire my eyes. The camera centered exactly on what I was gazing at, whether it was a car or a mailbox.
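Sony hasn’t said how the demo turns raw gaze samples into camera motion, but that smooth, fatigue-free feel suggests some filtering plus a dead zone around the center of the screen. Here is a minimal sketch of that idea in Python; the class and tuning values (GazeCamera, DEAD_ZONE, TURN_SPEED, SMOOTHING) are my assumptions for illustration, not anything from the Magic Lab demo.

```python
import math

# Hypothetical tuning values -- assumptions, not Sony's numbers.
DEAD_ZONE = 0.08    # normalized radius around screen center where gaze is ignored
TURN_SPEED = 2.0    # radians/sec of camera turn when gazing at the screen edge
SMOOTHING = 0.85    # per-frame low-pass factor for noisy tracker samples

class GazeCamera:
    """Pans the view toward wherever the player is looking."""

    def __init__(self):
        self.yaw = 0.0
        self.pitch = 0.0
        self.gaze_x = 0.0  # filtered gaze in [-1, 1]; 0 is screen center
        self.gaze_y = 0.0

    def update(self, raw_x, raw_y, dt):
        # Low-pass filter the raw samples so saccades and sensor jitter
        # don't twitch the camera.
        self.gaze_x = SMOOTHING * self.gaze_x + (1 - SMOOTHING) * raw_x
        self.gaze_y = SMOOTHING * self.gaze_y + (1 - SMOOTHING) * raw_y
        # Only pan when the player looks away from the center, so a
        # resting gaze doesn't slowly drift the view.
        if math.hypot(self.gaze_x, self.gaze_y) > DEAD_ZONE:
            self.yaw += self.gaze_x * TURN_SPEED * dt
            self.pitch -= self.gaze_y * TURN_SPEED * dt

# Usage: one 60 fps frame where the player glances toward the right edge.
cam = GazeCamera()
cam.update(raw_x=0.7, raw_y=0.0, dt=1 / 60)
```

A dead zone like this would explain why the view held steady while I examined a target, and the filtering why the motion never felt dizzying.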
It started getting really interesting once I could throw fireballs with my eyes. I started blowing up signs, cars, and newspaper stands. I came to the familiar Seattle Space Needle area in Infamous and challenged a bunch of guards. They started shooting at me, and I fired back. I did so by looking at a particular guard and then pulling the R2 trigger on the PS4 controller. It was fast and accurate, once I got the hang of it.
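The firing mechanic — look at a guard, then pull R2 — amounts to a gaze-driven hit test. The sketch below shows the shape of that logic; Target, raycast_from_gaze, and the rectangle-based scene are stand-ins I invented for a real engine’s screen-space raycast, not any PlayStation API.

```python
from dataclasses import dataclass

@dataclass
class Target:
    name: str
    position: tuple  # (x, y, z) in world space

def raycast_from_gaze(gaze_x, gaze_y, scene):
    """Stand-in for an engine raycast: return whichever object the filtered
    gaze point lands on, or None. A real game would project a ray from the
    camera through the gaze point and test it against world geometry."""
    for obj, (x0, y0, x1, y1) in scene:
        if x0 <= gaze_x <= x1 and y0 <= gaze_y <= y1:
            return obj
    return None

def on_r2_pressed(gaze_x, gaze_y, scene):
    # Fire at whatever the player is looking at the moment R2 is pulled.
    target = raycast_from_gaze(gaze_x, gaze_y, scene)
    if target is not None:
        print(f"firing at {target.name} at {target.position}")

# Usage: two guards occupy rectangles in normalized screen space.
scene = [
    (Target("guard_a", (12.0, 0.0, 30.0)), (-0.6, -0.2, -0.3, 0.2)),
    (Target("guard_b", (18.0, 0.0, 42.0)), (0.2, -0.1, 0.5, 0.3)),
]
on_r2_pressed(0.3, 0.1, scene)  # the player is looking at guard_b
```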
During this process, I never had to move an analog stick on the controller. The eye-tracking still has challenges, such as spinning the camera a full 180 degrees. That’s why it could work as a complement to the handheld controller rather than a replacement for it.
To prime the demo for your eyes, you have to sit in front of an infrared sensor and allow it to calibrate. I had to stare at a ball and watch it move around the screen. The process takes seconds, though I had to do it twice to get it working right. It worked fine even though I was wearing glasses.
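Staring at a moving ball is a classic way to gather calibration data: the system knows exactly where the ball is, samples your raw eye measurements at the same instant, and fits a mapping between the two. SMI’s actual procedure is proprietary; this hypothetical least-squares version (the function names and the affine-map assumption are mine) shows the general idea.

```python
import numpy as np

def fit_calibration(raw_samples, screen_points):
    """raw_samples: N x 2 raw eye measurements captured while the player
    stared at the moving ball; screen_points: N x 2 known ball positions.
    Returns a 3 x 2 affine matrix mapping raw eye data to screen coords."""
    raw = np.asarray(raw_samples, dtype=float)
    A = np.hstack([raw, np.ones((raw.shape[0], 1))])  # homogeneous coords
    M, *_ = np.linalg.lstsq(A, np.asarray(screen_points, dtype=float), rcond=None)
    return M

def gaze_to_screen(M, raw_x, raw_y):
    # Apply the fitted affine map to a new raw sample.
    return np.array([raw_x, raw_y, 1.0]) @ M

# Usage: five ball positions visited during the few-second calibration pass,
# paired with made-up raw eye readings.
screen = [(0.5, 0.5), (0.1, 0.1), (0.9, 0.1), (0.1, 0.9), (0.9, 0.9)]
raw = [(0.02, -0.01), (-0.30, -0.28), (0.33, -0.27), (-0.31, 0.30), (0.35, 0.31)]
M = fit_calibration(raw, screen)
print(gaze_to_screen(M, 0.02, -0.01))  # ~ (0.5, 0.5), the screen center
```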
Larsen said the demo used an off-the-shelf infrared camera from SensoMotoric Instruments. It controls the lighting and is precise enough to detect features like your eyes and their movements. It’s harder to use this kind of solution when you’re sitting far away from the camera, such as on a couch in front of a big-screen TV. The setup I used was close-up, like a PC gaming rig.
“We have to evaluate the timing when all the right parts would be in place to do this right,” Larsen said. “Right now, we’re just communicating what we can do.”
One day, Sony might be able to combine the eye-tracking technology with virtual reality for a more immersive experience. That’s why it’s part of Richard Marks’ Sony Magic Lab research-and-development facility in San Mateo, Calif. The eye-tracking comes from the same team behind the Project Morpheus VR headset, the PlayStation Move, and the PlayStation Camera.
[aditude-amp id="medium1" targeting='{"env":"staging","page_type":"article","post_id":1480710,"post_type":"story","post_chan":"none","tags":null,"ai":false,"category":"none","all_categories":"business,games,mobile,","session":"C"}']
“We want a system that is robust and accurate because we are creating an illusion,” Larsen said. “We want a game experience where you are expressing intention through your eyes. It overlaps with what you have to do in the game. We want to make it so you don’t have to process it much. It’s a different kind of experience, like the illusion of having a special power. You direct actions with thought, and so it is sort of like a brain interface. It is interpreting what your desire is.”
Larsen said it will be interesting to see what happens when competitors get to see your eyes in multiplayer matches. They might be able to figure out whether you’re lying to them during a match. Or the game itself could watch your eyes and adjust what happens in a single-player game.
“This is just one application, but it could also be used so the game responds to what your eyes are doing,” Larsen said. “They can infer your intention and predict what your actions will be. We’ve explored that in technical demos but haven’t integrated that into what we’re showing yet.”
As for shooting, “You might find you’re not using the analog stick as much, or at all,” Larsen said.
[aditude-amp id="medium2" targeting='{"env":"staging","page_type":"article","post_id":1480710,"post_type":"story","post_chan":"none","tags":null,"ai":false,"category":"none","all_categories":"business,games,mobile,","session":"C"}']