The Mind’s Eye research has been under way for about 18 months and is roughly halfway complete. After three years, the various vision projects will lead to lab prototypes that can eventually be brought to market. The systems being developed will do things like recognize someone walking, touching an object, or taking other actions. If the research pans out, we could see robots and other machines getting much better at the vision-based tasks that humans currently excel at.
“The difference between how a machine can describe a scene and how a person would describe that scene is quite vast still,” Donlon said. “Solving this is what the Mind’s Eye program is about. So far, humans are still best at this.”
To teach machines how to filter out useless information, the Mind’s Eye researchers are showing all sorts of scenes to the computer-driven machines so that they can understand what is happening. Tracking people moving in a parking lot is doable today.
“What we need to be able to do to make truly robust systems is to enable the systems to recognize anything without advance training,” Donlon said. “I’m absolutely thrilled at the progress we have made, but we are nowhere near where we need to be in the informativeness of the vision analysis or the efficiency of the computing. There are plenty of ludicrous results that go along with the good results.”
Soldiers monitoring command screens can spend so much time watching them that they miss what is important and fail to pass that information on to troops in the field.
Right now, the military uses scout robots like those made by iRobot, pictured left, to do reconnaissance ahead of troops so that they can warn of ambushes or other dangers. The robots have cameras on board, can point at an area, and remain concealed. They can then send back video footage to be interpreted by human analysts. But sending out the right video at the right time is critical.
“This takes some human scouts out of harm’s way and creates more situational awareness,” Donlon said. “It ought to be possible to put the intelligence on the sensors, on the edge. The soldier can then be on the lookout for anomalies.”
These kinds of technologies could have both military and civilian applications. Private corporations could, for instance, pair the vision systems with surveillance cameras. Vision could also be useful in car safety: Google is working on a self-driving car project, for example, in hopes of reducing the more than a million car accidents that occur each year.
“DARPA has a [history] of pioneering technologies that have become important applications,” said Jeff Bier, chief executive of market research firm BDTI and founder of the Embedded Vision Alliance, which has 19 corporate members from Analog Devices to Texas Instruments. “We hope that’s going to happen in this category as well.”
Developers for the Mind’s Eye program include: Carnegie Mellon University, Co57 Systems, Colorado State University, Jet Propulsion Lab/Caltech, the Massachusetts Institute of Technology, Purdue University, SRI International, SUNY at Buffalo, the Netherlands Organization for Applied Scientific Research, University of Arizona, UC Berkeley, USC, General Dynamics Robotic Systems, iRobot, and Toyon Research.
[Photo credits: DARPA, Dean Takahashi]