
The eyes have it: Eyefluence may have the answer to navigating AR/VR

Eyefluence leaders (left to right) CTO Peter Milford, CEO Jim Marggraff, and VP of marketing David Stiehr.

Image Credit: Dean Takahashi

A quick five-minute demo of Eyefluence’s eye-tracking technology made me decide that it was one of the 10 best technologies of the 2016 Consumer Electronics Show, the big tech trade show in Las Vegas last month. And an extended demo at the company’s headquarters in Milpitas, California, confirmed that for me last week. Eyefluence has figured out a way for you to navigate augmented reality and virtual reality screens without using your hands, and that could be a missing link in making these technologies live up to their promise of immersive worlds.

This small company is likely to get a lot more attention as people discover how fun AR and VR can be. It’s like something out of science fiction, and that’s why the company was able to raise $14 million in November from Motorola Solutions Venture Capital, Jazz Venture Partners, NHN Investment, and Dolby Family Ventures. Intel Capital is also an investor. It’s just another sign of the excitement surrounding this fledgling market, which tech adviser Digi-Capital predicts will reach $120 billion in revenues in 2020.

[aditude-amp id="flyingcarpet" targeting='{"env":"staging","page_type":"article","post_id":1873789,"post_type":"story","post_chan":"none","tags":null,"ai":false,"category":"none","all_categories":"business,games,","session":"A"}']

At the entrance to Eyefluence’s headquarters are some very big pictures of eyes, each belonging to a different employee at the company. You can walk up to them and see every vein and subtle detail. The eyes are so big that they remind me of the eyes of T.J. Eckleburg, the symbolism-laden eyes on a billboard in F. Scott Fitzgerald’s novel The Great Gatsby.

Eyes tell us a lot about a person, and they are all different, said David Stiehr, the vice president of marketing at Eyefluence, in an interview with VentureBeat. The team at Eyefluence has been studying eyes and technology for eye tracking for the past 14 years, and they’ve come up with some very cool technology with applications in a wide array of industries, from healthcare to gaming.


“A big piece of this is the language we’ve created to let the eyes take control,” said Jim Marggraff, the chief executive of Eyefluence, in an interview. “The syntax of the language is comfortable for your eyes.”

https://www.youtube.com/watch?v=I-xP2h0b8zU

“It’s all about the biology and the technology,” Stiehr said. “Biology doesn’t change. It’s been that way for millennia. How do you bring technology to the biology, to let your eyes do amazing things? It comes from a very firm understanding of the biology of the eye and the eye-brain connection. We believe that all head-mounted displays are incomplete without eye-tracking technology.”

The technology comes from researchers and serial entrepreneur Marggraff, who created Livescribe, the smart pen company that Anoto acquired in November. But in contrast to other eye-tracking technologies that have recently been announced, Eyefluence has taken a deep dive into figuring out how the eye works and how we can read intent in where the eye is focused.

Right now, it doesn’t look that pretty, as Eyefluence has jury-rigged an Oculus Rift headset with a small camera that can detect your eye movements. I went into a room with lots of wires and gadgets and screens. I sat down and used the headset. It took a minute or so to learn how to look at objects on a screen, select one, and drill down on it as if I were clicking on an icon with a mouse. It was pretty easy and natural. It required only a short calibration, and it didn’t make my eyes feel tired. Within moments, I was able to target different spots in front of the VR screen.

What was very cool about the demo is that it was smart. That is, it guessed at my intentions and what it believed I wanted to do with my eyes. When I looked away or blinked, the demo still worked. So it knew when I really wanted to control something.

[aditude-amp id="medium1" targeting='{"env":"staging","page_type":"article","post_id":1873789,"post_type":"story","post_chan":"none","tags":null,"ai":false,"category":"none","all_categories":"business,games,","session":"A"}']

“We can calculate the point of regard,” Marggraff said, which means that Eyefluence figures out the point in front of you that your eyes are looking at. “That’s a very important idea in making an interface that is intuitive and easy enough for anyone to learn.”
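
To make “point of regard” concrete: given a gaze direction estimated for each eye, the point the eyes converge on sits where the two gaze rays pass closest to each other. Here is a minimal sketch of that geometry in Python (my own illustration of the general idea, not Eyefluence’s code), assuming the per-eye origins and directions have already been measured.

```python
import numpy as np

def point_of_regard(o_l, d_l, o_r, d_r):
    """Estimate the 3D point both eyes converge on: the midpoint of the
    closest approach between the left and right gaze rays."""
    o_l, d_l = np.asarray(o_l, float), np.asarray(d_l, float)
    o_r, d_r = np.asarray(o_r, float), np.asarray(d_r, float)
    d_l, d_r = d_l / np.linalg.norm(d_l), d_r / np.linalg.norm(d_r)

    w0 = o_l - o_r
    a, b, c = d_l @ d_l, d_l @ d_r, d_r @ d_r
    d, e = d_l @ w0, d_r @ w0
    denom = a * c - b * b
    if denom < 1e-9:
        return None                    # rays nearly parallel: gaze is at "infinity"
    t = (b * e - c * d) / denom        # distance along the left-eye ray
    s = (a * e - b * d) / denom        # distance along the right-eye ray
    return (o_l + t * d_l + o_r + s * d_r) / 2.0

# Example: eyes 6 cm apart, both aimed at a point about 50 cm ahead.
left, right = np.array([-0.03, 0.0, 0.0]), np.array([0.03, 0.0, 0.0])
target = np.array([0.1, 0.05, 0.5])
print(point_of_regard(left, target - left, right, target - right))  # ~ [0.1, 0.05, 0.5]
```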

I also watched Marggraff use it. I could see his screen, so I called out which icons he should target, and as I said the words, he pinpointed his gaze on each one immediately. For a gamer shooting at something, that kind of speed would be very useful. The whack-a-mole-style demo above illustrates how much faster you can target something with your eyes than with other interfaces: I hit the moles at an average of 0.28 seconds apiece using my eyes, versus 0.4 seconds using head movements.

Above: Eyefluence wants you to control devices with your eyes.

Image Credit: Eyefluence

I searched for Waldo in a Where’s Waldo image, using my eyes to zoom in or out on the image. And I saw how I could turn pages with my eyes, rotate a globe, and look at 40 different webpage screens floating in a 3D space. I could rapidly flip among the pages and search through them. It all worked in a pretty fluid way.

Marggraff also showed me how he could quickly navigate through a user interface overlay for augmented reality. The eye-tracking technology could tell whether he was looking at the overlay or at something in the real world in front of him. He showed how the hardware can identify you based on a scan of your iris within 100 milliseconds. And he showed me how a combination of a voice command and eye tracking let him speak a message and send it instantly as a text message to someone else.
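
Eyefluence didn’t say how its iris identification works. In iris recognition generally, a common approach is to unwrap the iris into a binary “iris code” and compare codes by the fraction of differing bits, masking out bits hidden by eyelids or reflections. The sketch below illustrates only that matching step with toy data; the code length, threshold, and names are assumptions, not Eyefluence specifics.

```python
import numpy as np

def iris_distance(code_a, code_b, mask_a, mask_b):
    """Fraction of usable bits that disagree between two binary iris codes.
    Masks mark bits hidden by eyelids, lashes, or reflections."""
    usable = mask_a & mask_b
    return ((code_a ^ code_b) & usable).sum() / max(int(usable.sum()), 1)

def identify(probe_code, probe_mask, enrolled, threshold=0.32):
    """Return the enrolled user whose iris code is closest to the probe,
    or None if nothing falls under the decision threshold."""
    best_name, best_dist = None, threshold
    for name, (code, mask) in enrolled.items():
        dist = iris_distance(probe_code, code, probe_mask, mask)
        if dist < best_dist:
            best_name, best_dist = name, dist
    return best_name

# Toy 2048-bit codes standing in for real iris templates.
rng = np.random.default_rng(0)
alice = rng.integers(0, 2, 2048, dtype=np.uint8)
mask = np.ones(2048, dtype=np.uint8)
probe = alice.copy()
probe[rng.choice(2048, 200, replace=False)] ^= 1        # ~10 percent bit noise
print(identify(probe, mask, {"alice": (alice, mask)}))  # -> alice
```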

[aditude-amp id="medium2" targeting='{"env":"staging","page_type":"article","post_id":1873789,"post_type":"story","post_chan":"none","tags":null,"ai":false,"category":"none","all_categories":"business,games,","session":"A"}']

He also showed a heads-up display for a maintenance worker on an oil rig. Using AR glasses, Marggraff could see a checklist of things to be done. By looking at each task, he could quickly check off the items and get rid of them as he handled them. And he showed how you can take a picture of something by looking at it and then search for it on the Internet and identify it. Within a second or two, he identified who made a box of chocolates, and he used his eyes to order the chocolates from Amazon.

Sadly, Eyefluence’s tech won’t be immediately used in some of the first VR headsets coming out. The solution right now requires a custom chip on a tiny flexible circuit board that can be attached to a VR headset, AR glasses, and other kinds of products. It’s going to take some time before Eyefluence gets designed into things.

“There will be products in the next generation where eye tracking will be part of it,” Marggraff said.

Above: Eyefluence CEO Jim Marggraff wears an Oculus Rift VR headset.

Image Credit: Dean Takahashi

Other technologies are limited to making you gaze at an object for a long time or wink for a while. But Eyefluence is trying to figure out what you intend to do based on the behavior of your eyes and then make that thing happen.
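
The dwell-based approach Marggraff is contrasting with is easy to picture: the system selects whatever you have stared at for long enough. A minimal, purely illustrative sketch of such a dwell timer (the 0.8-second threshold is an assumption) shows why it feels slow and tiring, since every selection costs a deliberate, sustained stare.

```python
DWELL_SECONDS = 0.8   # assumed dwell threshold; long dwells are what tire the eyes

def dwell_select(gaze_samples, dwell=DWELL_SECONDS):
    """Yield (time, target) whenever the gaze has rested on one target for
    `dwell` seconds. gaze_samples: iterable of (timestamp, target_id or None)."""
    current, since = None, None
    for t, target in gaze_samples:
        if target != current:
            current, since = target, t       # gaze moved: restart the timer
        elif target is not None and t - since >= dwell:
            yield t, target                  # dwell completed: select it
            since = t                        # demand a fresh dwell before selecting again

# A 30 Hz stream: 1.2 seconds on "icon_a" produces exactly one selection.
samples = [(i / 30, "icon_a" if i < 36 else None) for i in range(60)]
print(list(dwell_select(samples)))           # [(0.8, 'icon_a')]
```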

[aditude-amp id="medium3" targeting='{"env":"staging","page_type":"article","post_id":1873789,"post_type":"story","post_chan":"none","tags":null,"ai":false,"category":"none","all_categories":"business,games,","session":"A"}']

Eyefluence’s technology goes back to the 1990s, when researchers at a company called Eye-Com were trying to create ways to control devices with the eyes for the benefit of police and firefighters. But it didn’t really get off the ground. Marggraff met them in 2013 and acquired the rights to the technology, then started Eyefluence as a new company with cofounder David Stiehr. They made the camera hardware small and designed it to be robust to any kind of sunlight and to any kind of eyes. Then they started creating a user interface.

The team has about 27 people, including some of the staff that used to work at the previous company. A third of the people have doctorates, and all but two of them are engineers.

“We said, let’s think about the experience first and create everything we need for that,” Marggraff said.

[aditude-amp id="medium4" targeting='{"env":"staging","page_type":"article","post_id":1873789,"post_type":"story","post_chan":"none","tags":null,"ai":false,"category":"none","all_categories":"business,games,","session":"A"}']

Rivals include Tobii, which has worked with SteelSeries to create a product that lets you control a game with your eyes. Sony’s PlayStation Magic Lab has also shown off eye-tracking controls for its games. I’ve seen all of the technologies, and Eyefluence is on to something in the way that it has approached the problem. Eyefluence has about 30 granted or pending patents.

The company plans to license a suite of its eye-tracking algorithms and vision-driven user interface interaction model. Eyefluence hopes that its technology can be used in AR and VR devices in the healthcare, oil and gas, financial services, entertainment, assistive technology, and defense industries — and more. Marggraff said Eyefluence is talking with Fortune 500 companies about partnerships.

Using your eyes to control something is very difficult because you can’t do anything that fatigues them. I experienced this when I played a game called Call of the Starseed on the HTC Vive on a trip to Seattle. In that game, while wearing the Vive VR headset, I blinked to make my character move from one place to another. But if you keep doing that throughout the game, your eyes get “incredibly fatigued,” Stiehr said.

https://www.youtube.com/watch?v=LUZgNB9C0co

[aditude-amp id="medium5" targeting='{"env":"staging","page_type":"article","post_id":1873789,"post_type":"story","post_chan":"none","tags":null,"ai":false,"category":"none","all_categories":"business,games,","session":"A"}']

You also have to re-acquire the location of the eyes if someone looks away from a computer or a screen. With the Tobii technology, you have to calibrate a camera so that it can track your eyes using infrared light. Eyefluence’s approach to calibration is similar, but when you put on a VR headset with the Eyefluence cameras built into it, the system automatically detects your eyes. It is also designed to be robust in that it can track a wide variety of eyes, Marggraff said.
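
For context, a conventional calibration of the kind described for Tobii-style trackers typically has you fixate a handful of known on-screen dots and then fits a mapping from the camera’s pupil coordinates to screen coordinates. Here is a rough sketch of that generic step, not of Eyefluence’s method, which skips the lengthy calibration.

```python
import numpy as np

def fit_calibration(pupil_xy, screen_xy):
    """Fit a second-order polynomial mapping from pupil-center coordinates
    (as seen by the infrared camera) to on-screen gaze coordinates."""
    px, py = pupil_xy[:, 0], pupil_xy[:, 1]
    A = np.column_stack([np.ones_like(px), px, py, px * py, px**2, py**2])
    coeffs, *_ = np.linalg.lstsq(A, screen_xy, rcond=None)
    return coeffs                                  # shape (6, 2), one column per screen axis

def map_gaze(coeffs, pupil_xy):
    px, py = pupil_xy[:, 0], pupil_xy[:, 1]
    A = np.column_stack([np.ones_like(px), px, py, px * py, px**2, py**2])
    return A @ coeffs

# Toy calibration: the user fixates a 3x3 grid of dots; the measured pupil
# positions are a noisy, scaled-and-shifted version of the true targets.
screen = np.array([[x, y] for x in (0.1, 0.5, 0.9) for y in (0.1, 0.5, 0.9)])
rng = np.random.default_rng(1)
pupil = screen * 0.04 + 0.02 + rng.normal(0, 1e-4, screen.shape)
coeffs = fit_calibration(pupil, screen)
print(np.round(map_gaze(coeffs, pupil), 3))        # recovers the screen targets
```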

The technology filters out false signals, such as when you’re blinking or looking away because your eyes are watering. Eyefluence pays attention to parameters such as the eye’s structure, its color, the size of the pupil (which changes with lighting conditions), the size of the eyebrows and eyelashes, the nose bridge, and other features that have to be measured. Eyefluence has to understand how vision works and how the connection between the brain and the eye works.
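
As a rough illustration of that filtering (my own sketch, not Eyefluence’s algorithm), a tracker can label gaps in its gaze stream by how long the pupil is lost: a fraction of a second is a blink, a longer dropout means the user looked away, and neither should count as intent.

```python
BLINK_MAX_SECONDS = 0.4   # assumed cutoff: shorter tracking gaps are treated as blinks

def classify_gaps(samples, blink_max=BLINK_MAX_SECONDS):
    """Label every tracking gap in a gaze stream as a blink (short) or a
    look-away (long), so that neither is mistaken for intent.
    samples: iterable of (timestamp, gaze_point or None)."""
    events, gap_start = [], None
    for t, gaze in samples:
        if gaze is None:
            if gap_start is None:
                gap_start = t                      # pupil just disappeared
        else:
            if gap_start is not None:
                kind = "blink" if t - gap_start < blink_max else "look_away"
                events.append((gap_start, t, kind))
                gap_start = None
            events.append((t, t, "gaze"))
    return events

# A 60 Hz stream: a 0.2-second dropout reads as a blink, a 1-second dropout
# as a look-away; only the "gaze" events would feed the interface.
stream = [(i / 60, None if 30 <= i < 42 or 90 <= i < 150 else (0.5, 0.5))
          for i in range(200)]
print([kind for *_, kind in classify_gaps(stream) if kind != "gaze"])  # ['blink', 'look_away']
```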

In past years, eye-tracking systems for the disabled have used blinking, staring, or winking to control things. When somebody is drowsy, their blinks take a long time, and if you watch for those long blinks, you can tell when someone is about to fall asleep. If you’re driving a car, that’s pretty good information to have. That’s another set of problems Eyefluence might be able to address, though its main focus at the moment is AR and VR. Having to dwell or stare at something for too long is tiring as well.
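
Detecting drowsiness from those long blinks can be as simple as watching recent blink durations. The sketch below is illustrative only, with made-up thresholds, and flags a driver when most of the recent blinks run long.

```python
def looks_drowsy(blink_durations, slow_blink=0.5, window=10, ratio=0.5):
    """Flag drowsiness when at least `ratio` of the last `window` blinks
    lasted longer than `slow_blink` seconds (all thresholds illustrative)."""
    recent = blink_durations[-window:]
    slow = sum(1 for d in recent if d > slow_blink)
    return bool(recent) and slow / len(recent) >= ratio

print(looks_drowsy([0.1, 0.15, 0.12, 0.6, 0.7, 0.65, 0.8]))   # True: blinks are stretching out
```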

“Our prime directive is to let your eyes do what your eyes do,” Marggraff said. “We don’t force you to do strange things. We have developed a whole user interface around what your eyes do naturally so that it is fast and non-fatiguing. And easy to learn. We think of the range of things that can be presented in VR. You can surround yourself with a huge amount of information. We see that as a change in how humans can process information, solve problems, learn, and communicate.”

Above: Jim Marggraff, CEO of Eyefluence, wearing AR glasses with Eyefluence prototype hardware attached.

Image Credit: Dean Takahashi

https://www.youtube.com/watch?v=iQsY3uLvYQ4
