
With Kinect-like gestures, SoftKinetic leads the way to Intel’s perceptual computing

SoftKinetic's close-range gesture control technology is at the heart of Intel's upcoming non-touch control for laptops. The companies call this development the beginning of perceptual computing.

The age of perceptual computing starts with devices that can be controlled with the wave of your hand or the sound of your voice rather than just a mouse, keyboard, or touchscreen. Intel showed off an example of what that means today at its developer conference in San Francisco, and the startup behind the technology is SoftKinetic. With perceptual computing, you can control a computer through hand movements, face recognition, voice commands, touchscreen swipes, or traditional mouse-and-keyboard controls.


Brussels, Belgium-based SoftKinetic makes gesture-control cameras and software, much like the elements used in Microsoft’s Kinect motion-sensing system for the Xbox 360 game console. But SoftKinetic has adapted its technology so that a laptop can sense movements by someone just inches away from it.

Intel believes that the close-range gesture-recognition technology is ideal for controlling thin and light laptops — dubbed ultrabooks — which resemble Apple’s MacBook Airs. SoftKinetic’s technology will be included in Intel’s perceptual computing software development kit (SDK) coming in 2013.

The technology includes SoftKinetic’s DepthSense camera, which can detect finger movements, while the company’s software takes the data from those gestures and simplifies it into specific operator controls that a computer can process. SoftKinetic can detect gestures as close as six inches from the screen, while Microsoft’s tech currently detects movement from about eight to 10 feet away.
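To make that pipeline concrete, here is a minimal sketch of how an application might turn close-range depth-camera hand data into a simple control. The HandFrame structure, the interpret function, and the thresholds below are illustrative assumptions for this article, not SoftKinetic’s or Intel’s actual SDK interfaces.

```python
# Hypothetical sketch: mapping close-range depth-camera hand data to a
# coarse operator control. The data structure, function, and thresholds
# are illustrative assumptions, not SoftKinetic's or Intel's real API.
from dataclasses import dataclass
from typing import Optional

@dataclass
class HandFrame:
    x: float             # normalized horizontal hand position, 0.0 to 1.0
    depth_inches: float  # estimated distance of the hand from the screen

def interpret(prev: HandFrame, curr: HandFrame) -> Optional[str]:
    """Turn two consecutive hand frames into a simple gesture command."""
    if curr.depth_inches < 6.0:
        return None  # inside the roughly six-inch minimum range cited above
    dx = curr.x - prev.x
    if dx > 0.15:
        return "swipe_right"  # e.g., advance a slide or photo
    if dx < -0.15:
        return "swipe_left"
    return None

# A rightward hand motion across about a fifth of the field of view:
print(interpret(HandFrame(x=0.40, depth_inches=12.0),
                HandFrame(x=0.62, depth_inches=12.0)))  # prints "swipe_right"
```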

“SoftKinetic is a pioneer and leader in the field of 3D gesture recognition who is helping us to make our vision of natural, interactive user experiences a reality,” said Achin Bhowmik, director of perceptual computing at Intel. “Gesture-based interaction enables humans to interact in a natural, intuitive way with their computer systems.”

Computer makers and software developers will use the Intel SDK to design computers with a wide variety of control schemes. This development, in turn, could help the PC deal with competition from smartphones and tablets, which is strategically important to Intel as the world’s largest chip maker. SoftKinetic is making games that will be included in Intel’s beta testing of perceptual computing, due to debut in the fourth quarter of 2013.

“We are excited to contribute our technology to the Intel perceptual computing program,” said Michel Tombroff, CEO of SoftKinetic. “We believe that natural-gesture interaction offers intuitive, engaging, and, most of all, personalized experiences for the consumers.”

Intel is demonstrating the technology at the Intel Developer Forum this week at the Moscone Center West in San Francisco.
