Control System Improves Human-Robot Interaction
Using sensors to map human muscle movement, robots on a manufacturing line can anticipate operator motion.
By Dave Greenfield, Automation World - Filed Jan 22, 2014

Robot use in manufacturing is nothing new. Most industrial robots, however, operate in caged areas to protect humans from their movements. A few robots are used directly by people, in close quarters, for applications such as automobile assembly. For example, a person hanging a car door on a hinge uses a lever to guide a robot carrying the door. These types of robots are often referred to as examples of haptic technology.

Though these robots are a big help to the assembly workers, the interaction between humans and these robots can be more complicated than it appears.

“It turns into a constant tug of war between the person and the robot,” says Billy Gallagher, a recent Georgia Tech Ph.D. graduate in robotics who is leading the project to develop a control system that improves human-robot interaction. “Both [the person and robot] react to each other’s forces when working together. The problem is that a person’s muscle stiffness is never constant, and a robot doesn’t always know how to correctly react.” (See video at bottom of article.)

Explaining the confusion human movements can create for a robot, Gallagher says that as human operators shift the lever forward or backward, the robot recognizes the command and moves appropriately. But when they want to stop the movement and hold the lever in place, people tend to stiffen, contracting the muscles on both sides of the arm. This creates a high level of co-contraction.

This co-contraction leads to the robot becoming confused. “It doesn’t know whether the force is another command that should be amplified or is a ‘bounced’ force due to muscle co-contraction,” said Jun Ueda, Gallagher’s advisor and a professor in the Woodruff School of Mechanical Engineering. “The robot reacts regardless.”
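To make the distinction concrete, here is a minimal sketch of one common way a co-contraction level could be quantified from a pair of antagonist forearm muscles. The function name, the normalized EMG inputs, and the specific index are assumptions for illustration, not the researchers' actual computation.

# Hypothetical co-contraction index from normalized EMG envelopes (0..1)
# of an antagonist muscle pair on the operator's forearm. Illustrative only.

def co_contraction_index(flexor_emg: float, extensor_emg: float) -> float:
    """Return 0 when only one muscle is active and approach 1 when both
    muscles are strongly and equally active (a stiff, 'held in place' arm)."""
    total = flexor_emg + extensor_emg
    if total == 0.0:
        return 0.0
    return 2.0 * min(flexor_emg, extensor_emg) / total

A high index signals that the operator is bracing rather than commanding motion, which is exactly the case the robot otherwise misreads as another input force.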

The robot's response to the bounced force creates vibration. Human operators react to that vibration by stiffening their arms further, generating still more force, and the vibrations grow worse.

The control system developed at Georgia Tech eliminates the vibrations by using sensors worn on the operator's forearm. The sensors send muscle-activity data to a computer, which provides the robot with the operator's level of muscle contraction. The system judges the operator's physical status and intelligently adjusts how it should interact with the human. The result is a robot that reportedly moves easily and safely.

“Instead of having the robot react to a human, we give it more information,” says Gallagher. “Modeling the operator in this way allows the robot to actively adjust to changes in the way the operator moves.”
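The sketch below illustrates one plausible form such an adjustment could take: a one-dimensional admittance controller whose virtual damping rises with the measured co-contraction level, so forces from a stiffened arm are attenuated rather than amplified. The gains, the blending rule, and the function names are all assumptions for illustration; they are not the published Georgia Tech controller.

# Minimal sketch, assuming a 1-D lever axis and a co-contraction index (cci)
# in [0, 1] from sensors on the operator's forearm. Illustrative values only.

DT = 0.001            # control period (s)
MASS = 5.0            # virtual mass of the admittance model (kg)
DAMP_LOW = 20.0       # damping when the arm is relaxed (N·s/m)
DAMP_HIGH = 200.0     # damping when co-contraction is high (N·s/m)

def admittance_step(force: float, velocity: float, cci: float) -> float:
    """Advance the virtual dynamics M*a + B(cci)*v = F by one step and
    return the new commanded velocity for the robot."""
    damping = DAMP_LOW + cci * (DAMP_HIGH - DAMP_LOW)   # blend on co-contraction
    accel = (force - damping * velocity) / MASS
    return velocity + accel * DT

# Example: the same constant 10 N handle force produces a much lower steady
# velocity when the operator's muscles are co-contracted (cci near 1).
for cci in (0.1, 0.9):
    v = 0.0
    for _ in range(500):          # simulate 0.5 s
        v = admittance_step(force=10.0, velocity=v, cci=cci)
    print(f"cci={cci}: velocity after 0.5 s = {v:.3f} m/s")

In this toy model, a relaxed operator gets a responsive robot, while a braced operator gets a heavily damped one, which is the behavior the article attributes to the system: the robot stops amplifying forces that are really just muscle co-contraction.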

Improvements to the control system will continue under a $1.2 million National Robotics Initiative award, supported by National Science Foundation grant #1317718, aimed at better understanding the mechanisms of neuromotor adaptation in human-robot physical interaction.

This research is intended to benefit communities interested in the adaptive shared control approach for advanced manufacturing and process design, including the automobile, aerospace, and military sectors.

The video below shows how the control system works to improve the interaction between humans and robots.
