An obvious point of comparison for the work is human skin, which is a communicative wonder. The nerves in our skin convey temperature, pressure, shear forces and vibrations – from the finest breath of air to touch or pain sensations. At the same time, the skin is the organ by which we set ourselves apart from our environment and distinguish between environment and body.
The goal of the research at TUM is to develop an artificial skin for robots with a similar purpose: It will provide important tactile information to the robot and thus supplement its perception formed by camera eyes, infrared scanners, and gripping hands. As with human skin, the way the artificial skin is touched could, for example, lead to a spontaneous retreat (when the robot hits an object) or cause the machine to use its eyes to search for the source of contact.
Such behavior is especially important for service robots that assist people and must navigate constantly changing environments. “In contrast to the tactile information provided by the skin, the sense of sight is limited because objects can be hidden,” explains Philip Mittendorfer, one of the researchers.
The centerpiece of the group's robotic skin is a hexagonal circuit board about five square centimeters in size. Each small board carries four infrared sensors that detect anything closer than one centimeter. “We thus simulate light touch,” explains Mittendorfer. “This corresponds to our sense of the fine hairs on our skin being gently stroked.” There are also six temperature sensors and an accelerometer. The accelerometer allows the machine to accurately register the movement of individual limbs, for example its arms, and thus to learn which body parts it has just moved. “We try to pack many different sensory modalities into the smallest of spaces,” explains the engineer. “In addition, it is easy to expand the circuit boards to later include other sensors, for example, pressure.”
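The sensor mix per module can be sketched in code. The following is a minimal, hypothetical model based only on the counts given above (four infrared proximity sensors with a one-centimeter range, six temperature sensors, one three-axis accelerometer); all names and the data layout are illustrative assumptions, not the group's actual firmware.

```python
from dataclasses import dataclass

# Illustrative sketch of one hexagonal skin module's sensor payload.
# Names and structure are assumptions; only the sensor counts and the
# 1 cm infrared range come from the article.
PROXIMITY_THRESHOLD_CM = 1.0  # IR sensors register anything closer than 1 cm


@dataclass
class SkinModuleReading:
    ir_proximity_cm: list   # 4 infrared distance readings, in centimeters
    temperatures_c: list    # 6 temperature readings, in degrees Celsius
    accel_g: tuple          # 3-axis accelerometer reading (x, y, z), in g

    def light_touch(self) -> bool:
        """'Light touch' fires when any IR sensor sees something within 1 cm."""
        return any(d < PROXIMITY_THRESHOLD_CM for d in self.ir_proximity_cm)


reading = SkinModuleReading(
    ir_proximity_cm=[5.0, 0.6, 4.2, 3.8],  # one sensor sees an object at 0.6 cm
    temperatures_c=[22.1] * 6,
    accel_g=(0.0, 0.0, 1.0),               # at rest, gravity on the z-axis
)
print(reading.light_touch())  # → True
```

The design point this illustrates is the one Mittendorfer makes: the module is a container for several modalities at once, and adding a new sensor type (say, pressure) would simply mean adding another field.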
Board by board, the modules are placed together, forming a honeycomb-like planar structure to be worn by the robot. For the machine to sense anything, the signals from the sensors must reach a central computer for processing. Each sensory module therefore not only passes on its own information but also serves as a data hub for neighboring sensory elements. This happens automatically, ensuring that signals can take alternative routes if a connection fails.
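The failover idea behind this routing can be sketched with a simple graph search. The topology, node names, and use of breadth-first search below are all illustrative assumptions; the article only says that modules relay neighbors' data and that signals find alternative paths when a link fails.

```python
from collections import deque


def route_to_hub(links, start, hub):
    """Breadth-first search for a path from a skin module to the central computer."""
    queue = deque([[start]])
    seen = {start}
    while queue:
        path = queue.popleft()
        node = path[-1]
        if node == hub:
            return path
        for neighbor in links.get(node, []):
            if neighbor not in seen:
                seen.add(neighbor)
                queue.append(path + [neighbor])
    return None  # no route left to the central computer


# A small, made-up honeycomb-ish neighbor map (undirected, listed both ways).
links = {
    "A": ["B", "C"], "B": ["A", "hub"], "C": ["A", "D"],
    "D": ["C", "hub"], "hub": ["B", "D"],
}
print(route_to_hub(links, "A", "hub"))  # → ['A', 'B', 'hub']

# Simulate a failed connection: drop the A–B link on both sides.
links["A"].remove("B")
links["B"].remove("A")
print(route_to_hub(links, "A", "hub"))  # → ['A', 'C', 'D', 'hub']
```

The second call shows the property the researchers describe: with the direct link gone, the same module still reaches the central computer over a detour through its neighbors.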
Only a small piece of skin is currently complete. The prototype's 15 sensor modules are distributed along a long robot arm, with at least one on each segment. The assemblage already shows that the principle works: a light pat or tap is enough to make the arm react. “We will close the skin and generate a prototype which is completely enclosed with these sensors and can interact anew with its environment,” says Mittendorfer’s supervisor, Prof. Gordon Cheng. Prof. Cheng adds that this will be “a machine that notices when you tap it on the back… even in the dark.”