Advanced facial recognition technology being developed at the University of Dayton will recognize specific individuals and evaluate expressions to assess pain
Sending medical first responders into battle may be getting a little safer thanks to research at the University of Dayton.
Researchers are developing robots with facial-recognition capabilities that can be air-dropped into dangerous areas, reducing the risk to human responders, who are vulnerable to ambush by enemies posing as the wounded.
"There are a lot of wounded people on the field, but you want to bring home the right person. How do you know if they're on the right side? You don't want to send another soldier, but we can send a robot," said Vijayan Asari, Ohio Research Scholar in wide-area surveillance in the University of Dayton School of Engineering.
Asari said his group has been working with Old Dominion University on developing robots with "sight" capabilities that can detect and help transport wounded individuals. The partners received $1.6 million for the first phase of a project, begun in 2009, for the Telemedicine & Advanced Technology Research Center, part of the U.S. Army Medical Research and Materiel Command.
The machines would first filter out environmental distractions such as smoke, fire, haze and rain, then confirm that the injured figure is indeed human. Asari hopes the robot will be able to do so from about 300 feet away.
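The article does not say which method the team uses to cut through smoke, haze and rain. One widely used image-dehazing technique is the dark-channel prior; the sketch below is a minimal pure-Python version of it on a tiny RGB image, shown only to illustrate the kind of preprocessing such a robot might perform, not the Dayton team's actual algorithm.

```python
# Minimal dark-channel-prior dehazing sketch. Purely illustrative;
# not the algorithm described in the article.

def dehaze(img, patch=1, omega=0.95, t_min=0.1):
    """img: 2-D list of (r, g, b) floats in [0, 1]. Returns a dehazed copy."""
    h, w = len(img), len(img[0])
    # 1. Dark channel: per-pixel min over RGB, then a local min over a patch.
    dark = [[min(img[y][x]) for x in range(w)] for y in range(h)]
    dark = [[min(dark[yy][xx]
                 for yy in range(max(0, y - patch), min(h, y + patch + 1))
                 for xx in range(max(0, x - patch), min(w, x + patch + 1)))
             for x in range(w)] for y in range(h)]
    # 2. Atmospheric light: color of the brightest pixel in the dark channel.
    ay, ax = max(((y, x) for y in range(h) for x in range(w)),
                 key=lambda p: dark[p[0]][p[1]])
    A = img[ay][ax]
    # 3. Estimate transmission per pixel, then 4. recover scene radiance.
    out = []
    for y in range(h):
        row = []
        for x in range(w):
            t = max(t_min, 1.0 - omega * min(c / max(a, 1e-6)
                                             for c, a in zip(img[y][x], A)))
            row.append(tuple((c - a) / t + a for c, a in zip(img[y][x], A)))
        out.append(row)
    return out
```

On a hazy gray pixel next to a pure-white "sky" pixel, the recovered value drops sharply, which is the intended effect: haze adds brightness to every channel, and the prior subtracts it back out.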
"Once you have a clear picture of the face, recognition can happen in milliseconds," Asari said.
If the robot identifies the soldier as the correct one by cross-checking facial features against a database, it will move closer to assess whether it can evacuate the injured soldier on its own or a full rescue crew needs to be dispatched.
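The cross-checking step described above can be sketched as comparing a face representation captured in the field against enrolled records. In the toy version below, faces are stood in for by short feature vectors compared with cosine similarity; the names, vectors and threshold are all hypothetical, and a real system would derive embeddings from a trained face-recognition model.

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) *
                  math.sqrt(sum(y * y for y in b)))

def identify(probe, database, threshold=0.9):
    """Return the best-matching identity, or None if nothing clears the threshold."""
    best_id, best_score = None, threshold
    for identity, enrolled in database.items():
        score = cosine_similarity(probe, enrolled)
        if score >= best_score:
            best_id, best_score = identity, score
    return best_id

# Hypothetical enrolled records (a real embedding has hundreds of dimensions).
db = {"sgt_jones": [0.9, 0.1, 0.2], "cpl_smith": [0.1, 0.95, 0.3]}
identify([0.88, 0.12, 0.19], db)  # → "sgt_jones" (close to the enrolled vector)
identify([0.5, 0.5, 0.5], db)     # → None (ambiguous; no match clears 0.9)
```

Returning None rather than a weak guess mirrors the article's concern: an uncertain match means dispatching a human crew, not evacuating the wrong person.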
"We want it to be as a human sees," Asari said. "We want the robot to be able to identify bleeding or whether the patient can't move their hand. If the robot can identify a hand or arm injury, it will not try to lift the patient by the hand or arm."
Asari has been working on facial recognition software at Old Dominion and the University of Dayton for years. But with this project, he is expanding his research into facial dynamics, a new concept taking recognition a step further than just physical facial features.
Robots will use facial dynamics to identify how a person moves their mouth when talking, smiling or yelling, and what expressions people make when they're in pain. Controllers in the field can also use the dynamics system to recognize whether a person is sad, happy, angry or in distress.
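The facial-dynamics idea can be illustrated with a toy classifier: rather than matching one still image, it summarizes how a facial feature moves over time. Here the input is a hypothetical time series of normalized mouth-opening measurements, and the labels and thresholds are invented for the sketch; the actual system described in the article is far more sophisticated.

```python
# Toy facial-dynamics classifier: label a mouth-movement time series
# by its summary statistics. Thresholds and labels are invented.

def mouth_dynamics_features(openings):
    """Summarize a mouth-opening series (values in [0, 1]) as (mean, range)."""
    mean = sum(openings) / len(openings)
    spread = max(openings) - min(openings)
    return mean, spread

def classify_dynamics(openings):
    mean, spread = mouth_dynamics_features(openings)
    if spread < 0.1:
        return "still"      # mouth barely moves across the sequence
    if mean > 0.6:
        return "yelling"    # wide open on average, and moving
    return "talking"        # moderate opening with lots of motion

classify_dynamics([0.05, 0.06, 0.05, 0.07])  # → "still"
classify_dynamics([0.2, 0.5, 0.1, 0.4])      # → "talking"
classify_dynamics([0.7, 0.9, 0.6, 0.8])      # → "yelling"
```

The point of the sketch is the shift in input: a recognition system compares static features, while a dynamics system classifies trajectories of movement, which is what lets it infer speech, yelling or pain.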