Smarter Robot Cameras Sought by Military
Mind’s Eye research developing better machine vision
By Robotics Trends' News Sources - Filed Apr 13, 2012

Computer vision works much better than it once did, and that could enable a diverse range of machines to see and understand their environments. Such machines could be useful in everything from military scouting to self-driving cars.

That’s why the Defense Advanced Research Projects Agency, or DARPA, is doing vision research in a program known as Mind’s Eye. James Donlon, program manager for the Mind’s Eye project, said at the recent Embedded Vision Alliance summit in San Jose, Calif., that the vision systems being tested now aren’t bad at recognizing patterns such as a person about to be hit by a car that is backing up. But they still make mistakes that are sometimes comical, such as mistaking a stationary object for a person or focusing on the wrong thing in a scene.

The Mind’s Eye research has been going on for about 18 months and is about halfway complete. After three years, the various vision projects will yield lab prototypes that can eventually be brought to market. The systems being developed will do things like recognize someone walking, touching an object, or taking other actions. If the research pans out, we could see robots and other machines getting much better at the vision-based tasks that humans are best at.

“The difference between how a machine can describe a scene and how a person would describe that scene is quite vast still,” Donlon said. “Solving this is what the Mind’s Eye program is about. So far, humans are still best at this.”

The program has about 15 teams working on various approaches. Donlon spoke to the Embedded Vision Alliance, which counts many chip makers as members, because technologists still need to make vision much more computationally feasible. But the task also requires a lot of software smarts aimed at making the hardware smarter. The technology spans recognition, description, prediction, filling gaps in information, and anomaly detection.
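As a loose illustration only (a toy sketch, not anything from the Mind’s Eye program; all names and data here are invented), three of those stages — prediction, gap-filling, and anomaly detection — can be mimicked on a one-dimensional track of an object’s position across video frames:

```python
# Toy sketch (hypothetical, stdlib only): three of the stages named above,
# applied to a 1-D track of an object's x-position over successive frames.

def predict_next(track):
    """Prediction: constant-velocity extrapolation from the last two points."""
    return track[-1] + (track[-1] - track[-2])

def fill_gaps(track):
    """Gap-filling: interpolate isolated frames where detection failed (None)."""
    filled = list(track)
    for i, v in enumerate(filled):
        if v is None:
            filled[i] = (filled[i - 1] + filled[i + 1]) / 2
    return filled

def detect_anomaly(track, threshold=3.0):
    """Anomaly detection: flag frames that deviate from the predicted path."""
    return [i for i in range(2, len(track))
            if abs(track[i] - predict_next(track[:i])) > threshold]

observations = [0.0, 1.0, None, 3.0, 4.0, 12.0]  # one dropped frame, one jump
track = fill_gaps(observations)                  # [0.0, 1.0, 2.0, 3.0, 4.0, 12.0]
print(predict_next(track[:5]))                   # 5.0 — expected next position
print(detect_anomaly(track))                     # [5] — the jump to 12.0 stands out
```

Real systems use far richer models, but the division of labor is the same: predict what the scene should do next, then treat large deviations as the events worth reporting.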

To teach machines how to filter out useless information, the Mind’s Eye researchers are showing all sorts of scenes to the computer-driven machines so that they learn what is happening in each one. Tracking people moving in a parking lot is doable today.

“What we need to be able to do to make truly robust systems is to enable the systems to recognize anything without advance training,” Donlon said. “I’m absolutely thrilled at the progress we have made, but we are nowhere near where we need to be in the informativeness of the vision analysis or the efficiency of the computing. There are plenty of ludicrous results that go along with the good results.”

In military situations, better vision systems could enable more sensors on a battlefield to interpret meaningful actions, such as an enemy troop movement. Right now, that information is funneled to a command center. But DARPA wants to move the intelligence to the edge of the network, so a camera sensor can send information directly to a soldier who needs it, Donlon said.
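The edge-of-the-network idea amounts to filtering at the sensor so that only meaningful events, rather than raw video, travel over the network. A minimal sketch of that trade-off (hypothetical code; `edge_filter` and the activity scores are invented for illustration):

```python
# Toy sketch (hypothetical, stdlib only) of "intelligence at the edge":
# instead of streaming every frame to a command center, the sensor sends
# a message only when a frame's activity score crosses a threshold.

def edge_filter(frame_activity, threshold=5):
    """Yield (frame_index, activity) only for frames worth reporting."""
    for i, activity in enumerate(frame_activity):
        if activity > threshold:
            yield (i, activity)

# 1000 frames of a mostly static scene; only two frames show real movement.
activity = [0] * 1000
activity[412] = 9   # e.g. a vehicle entering the scene
activity[750] = 7
events = list(edge_filter(activity))
print(events)       # [(412, 9), (750, 7)] — two messages instead of 1000 frames
```

The design choice is the one Donlon describes: the heavy interpretation happens at the sensor, and the network carries only the conclusions.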

--Courtesy Dean Takahashi, VentureBeat