Preview

Leap Motion introduces amazing Orion hand-tracking for virtual reality

David Holz, founder of Leap Motion, shows off hand-tracking in VR.

Image Credit: Dean Takahashi

No one knows the magic of hands like Leap Motion. And that’s why the San Francisco company is introducing a new hand-tracking system so that you can use your mitts in the new virtual worlds that are becoming possible with augmented and virtual reality.

The Orion hand-tracking product is a new generation of gesture control, one that Leap Motion promises is so natural and accurate that you’ll be able to use your fingers inside VR and AR worlds. I previewed the technology, and it worked really well. But it also made me wish that I could use my hands, arms, legs, and whole body inside virtual reality. It gave me a taste of things to come, and it made me more excited than ever that we’ll be able to manipulate what we see in virtual reality in a way that is accessible to everyone. Indeed, if VR is going to be a $30 billion industry by 2020, as tech adviser Digi-Capital predicts, then we’re really going to have to find a way to use our hands.


Above: Leap Motion’s hand-tracking technology.

Image Credit: Leap Motion

“We think hand-tracking is so necessary to elevate VR to the next level,” said Michael Buckwald, chief executive and cofounder of Leap Motion, in an interview with GamesBeat. “Orion is the first version of tracking that lives up to the vision of tracking hands and fingers so accurately that people can forget there is any separation between them and technology. It is the first version that is specific for virtual reality, and we think we can really transform VR in a fundamental way by letting people, as soon as they put on a headset, see their hands in front of them.”

Seeing your hands in VR is a really important part of creating presence, or the feeling that you are transported to somewhere else, Buckwald said. The Oculus Rift VR headset will debut with an Xbox One game controller as its primary input system when it launches on March 28. The HTC Vive will have two separate hand controls when it launches in April, and Oculus VR will launch its own separate hand controls, dubbed Oculus Touch, in the second half of the year. But none of those solutions promises to be as precise as Leap Motion’s hand tracking.


“If you have no input, it’s more like watching a 3D movie,” he said. “Or it would be like interacting with the world without fingers. The simplest things become so hard to do.”

Above: You can grab things with two fingers using Leap Motion in virtual reality.

Image Credit: Dean Takahashi

Orion is both software and hardware. The software will be pushed out today to everyone who has already bought the first generation of Leap Motion’s gesture-control peripheral, which debuted for Windows computers in 2013, and developers will be able to download it and start creating applications immediately. The detection hardware, which uses infrared sensors, will be embedded in products that launch later this year, Buckwald said.

Leap Motion’s quest to develop hand-tracking technology was based on the dreams of David Holz, its chief technology officer and cofounder. His intention is to remove the barrier between people and technology.

Holz and Buckwald founded the company in 2010. They found that, while the rest of the computing world was digital, hands were really tough to capture because they aren’t 1s and 0s. They’re analog, and they require precision tracking. The pair started with a PC hand-tracking tool that resembled what you could do with Microsoft’s Kinect motion-sensing system for the Xbox 360 and Xbox One game consoles. The new system, by contrast, was created for VR from the start. The older version couldn’t discern a hand when it was placed against a background object, such as your hand resting on your pants.

Above: Leap Motion’s Orion in action.

Image Credit: Leap Motion

“We focused on hands for so long because we believe there is something magical about hands,” Buckwald said. “If you think of the complexity involved in just holding a glass of water, all of the muscles that have to be coordinated down to the millimeter, and that this is all automatic — this is miraculous. It’s very clear that hands can offer us something more.”

The technology works much better than the earlier version of Leap Motion, since it can track all of your fingers and joints. It can detect which way your hands are facing, whether your palms are open or clenched into fists, and which way your fingers are pointing in 3D space. The tracking happens much faster than on the previous model, and it works in situations where no previous software could keep up. It even works when your fingers are “occluded,” meaning a sensor can’t see them directly because of the way your hand is turned.
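To make that concrete, here’s a rough sketch of the kind of per-frame hand data the tracking exposes to developers. It borrows the names from Leap Motion’s older desktop Python bindings (Leap.Controller, frame.hands, and so on); the Orion release is aimed mainly at game engines, so treat these class and property names as illustrative assumptions rather than the shipping API.

    # Rough sketch: read one tracking frame and report hand pose. The Leap
    # module and these property names come from the older desktop Python
    # bindings and are assumptions here, not the Orion-specific API.
    import Leap

    controller = Leap.Controller()
    frame = controller.frame()  # most recent tracking frame (may be empty before the device connects)

    for hand in frame.hands:
        label = "right" if hand.is_right else "left"
        # palm_normal points out of the palm; grab_strength runs 0.0 (open) to 1.0 (fist)
        palm_up = hand.palm_normal.y > 0
        fist = hand.grab_strength > 0.9
        print("%s hand: palm %s, %s" % (label,
                                        "up" if palm_up else "down",
                                        "fist" if fist else "open"))
        for finger in hand.fingers:
            # direction is a unit vector; tip_position is in millimeters
            print("  finger %d points along %s from %s" % (
                finger.id, finger.direction, finger.tip_position))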


Holz held his hand on his pants and showed how the system could detect which part of the image was his hand and which was his pants. That means the system can detect hands even with a cluttered background, and it can do this in a variety of lighting conditions.

Holz showed me a demo called “blocks,” where he sat on the ground, put on an Oculus Rift headset, and played with some virtual blocks. He picked them up and stacked them with his virtual fingers. He pulled with two fingers to create a new block, then stacked that one, too. He tossed blocks around and knocked them over with his hands. The demo showed off the interaction engine that ties the hand tracking to the physics engine and the game engine.
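For the curious, the pinch-and-grab logic of a demo like that might look something like the sketch below. The binding names again follow the older Python SDK, and nearest_block and release_to_physics are hypothetical app-level helpers, not anything in Leap Motion’s SDK; the real demo is built on the company’s Unity-based interaction engine.

    # Hypothetical pinch-to-grab loop in the spirit of the blocks demo.
    # nearest_block() and release_to_physics() are app-defined helpers,
    # and the Leap binding names follow the older Python SDK.
    import Leap

    PINCH_THRESHOLD = 0.8  # pinch_strength runs 0.0 (open) to 1.0 (fully pinched)

    def update_blocks(controller, blocks, held):
        """Grab the closest block while a hand pinches; let the physics
        engine take over again when the pinch opens."""
        frame = controller.frame()
        for hand in frame.hands:
            pinching = hand.pinch_strength > PINCH_THRESHOLD
            grab_point = hand.stabilized_palm_position  # Leap.Vector, millimeters
            if pinching and hand.id not in held:
                block = nearest_block(blocks, grab_point)  # app-defined helper
                if block is not None:
                    held[hand.id] = block
            elif not pinching and hand.id in held:
                release_to_physics(held.pop(hand.id))      # app-defined helper
            if hand.id in held:
                # while held, the block simply follows the hand
                held[hand.id].position = (grab_point.x, grab_point.y, grab_point.z)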

It does have a couple of catches. Your hands have to be visible to you, meaning you have to be looking at them, for you to do anything with them. You can’t point one hand in one direction and the other hand somewhere else and expect to keep control over both. Still, that matches the way we behave.


“Rarely do we interact with anything with our hands without looking at them,” Buckwald said.

You also can’t feel what you touch. There’s currently no “haptic feedback” to tell you that you’ve made contact with something. So the graphics have to be pretty precise when you reach out to grab a virtual block: the only way to tell whether you’re touching it is to try to make it move.

If anything, Orion teaches us that VR is going to be a moving target. It’s going to get better and better, perhaps pretty soon, and each new extension of it will seem like a huge leap forward.

Above: David Holz (left), CTO of Leap Motion, and CEO Michael Buckwald.

Image Credit: Dean Takahashi
