“We are trying to give humanoid robots the ability to behave and move more like human beings, to have the skill-learning capabilities of humans,” said C.S. George Lee, a Purdue professor of electrical and computer engineering who specializes in robotics.
Purdue will collaborate with researchers from the Advanced Institute of Science and Technology in Japan, which leads the world in humanoid-robot research.
“What we are going to try to do is capture the essence of how people learn movement skills,” said Howard Zelaznik, a Purdue professor of health and kinesiology.
The work is funded with a four-year, $900,000 grant from the National Science Foundation’s Information Technology Research program.
Humans are able to automatically combine a series of basic movements, such as pushing, lifting or grasping, to perform new tasks on the fly.
“For example, if I asked you to open a door and you were carrying two bags of groceries, you would know how to do that the first time through because you have in your repertoire the flexibility to combine old skills into new ones,” Zelaznik said. “We’d like to see whether we can figure out if there is a computationally reasonable way for a robot to take a set of skills and combine them into new skills rather efficiently, flexibly and quickly.”
Humanoid robots are robots that resemble people. A popular example is Honda Motor Co.’s “ASIMO,” which walks upright and has two arms, two legs and a head.
Although today’s humanoid robots represent an engineering feat, they do not move the way people do.
“They are very stiff and mechanical,” Lee said.
One important reason to teach humanoid robots how to quickly learn new movements is so that they will be better able to assist people.
“Imagine that a person in a wheelchair has just dropped his or her keys under the wheelchair, and the robot wasn’t programmed specifically to retrieve them from that location,” Zelaznik said. “We are trying to figure out how best to make that robot adaptable so that it can learn new skills quickly.”
The Purdue team, which includes four doctoral students, will use specialized equipment to record human movements in three dimensions.
Tiny coiled-wire “receivers” will be placed on certain body parts, such as the fingers and arms, while a person moves in a low-level magnetic field. The movement induces a current in each receiver, which laboratory computers track. Lee and Zelaznik will then be able to see the basic movement patterns, and they hope to use that information to build mathematical models that make robots move more like people.
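As a rough illustration of what can be done with such tracking data, consider turning a stream of 3-D receiver positions into a basic movement profile by numerically differentiating the samples. This is a hypothetical sketch, not the Purdue team's actual pipeline; the function name, sampling setup and units are illustrative assumptions.

```python
# Hypothetical sketch: derive a speed profile from tracked 3-D receiver
# positions sampled at a fixed rate. The names and sampling rate are
# illustrative, not details from the Purdue lab.

def speed_profile(positions, dt):
    """positions: list of (x, y, z) samples; dt: seconds between samples.
    Returns the speed between each pair of consecutive samples."""
    speeds = []
    for (x0, y0, z0), (x1, y1, z1) in zip(positions, positions[1:]):
        dist = ((x1 - x0) ** 2 + (y1 - y0) ** 2 + (z1 - z0) ** 2) ** 0.5
        speeds.append(dist / dt)
    return speeds

# Example: a receiver moving 1 unit along x every 0.5 seconds
samples = [(i, 0, 0) for i in range(4)]
print(speed_profile(samples, dt=0.5))  # three equal speeds of 2.0
```

Profiles like this (speed, and similarly acceleration) are the kind of low-level movement pattern a mathematical model of human motion would be fit to.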
The ultimate goal is to create software that enables robots to combine several of the most “primitive” skills to perform more complex movements.
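A toy sketch of that idea, under assumed design choices that are not from the project itself: if each primitive skill is a function that produces motion steps, a new, more complex skill can be built by sequencing primitives. The skill names and the sequencing scheme below are illustrative assumptions.

```python
# Hypothetical sketch of combining "primitive" skills into a new skill.
# The primitives and the composition scheme are illustrative assumptions,
# not the project's actual software design.

def reach(step):
    return f"reach:{step}"

def grasp(step):
    return f"grasp:{step}"

def lift(step):
    return f"lift:{step}"

def combine(*primitives):
    """Build a new skill that runs each primitive skill in sequence."""
    def skill(steps_per_primitive):
        trace = []
        for primitive in primitives:
            for step in range(steps_per_primitive):
                trace.append(primitive(step))
        return trace
    return skill

# A "pick up" skill assembled on the fly from existing primitives
pick_up = combine(reach, grasp, lift)
print(pick_up(2))
# ['reach:0', 'reach:1', 'grasp:0', 'grasp:1', 'lift:0', 'lift:1']
```

The point of the sketch is the composition step: no new low-level motion is programmed, yet the combined skill did not exist before, mirroring the grocery-bags-and-door example above.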
“We are not trying to make the robot perfect,” Zelaznik said. “People are not perfect. When we move, we are variable, we are imprecise, we make errors. We don’t exactly do the same thing time in and time out. We believe it is this imperfection that allows us the capability to be flexible.”
C. S. George Lee, (765) 494-1384,
Howard Zelaznik, (765) 494-5601,