Designed to bring the dexterity of an elephant's trunk to industrial robots, the appendage I am wrestling was launched by German engineering firm Festo as a proof-of-concept in 2010. The design showed that a trunk formed of 3D-printed segments can be controlled by an array of pneumatic artificial muscles.
But beyond a handful of motions, such as shaking hands – including once with German chancellor Angela Merkel – or grasping a bottle, the machine wasn't built with its own precision control software. "They deliver it without much control. You can try, but the arm will be centimetres from where it should be, which is no good," says Jochen Steil, an intelligent systems engineer at Bielefeld University, also in Germany.
That means people who aren't robot experts wouldn't be able to train it to carry out simple tasks, limiting its potential usefulness in the real world. But now Steil and his colleague Matthias Rolf have changed all that, as they told a human-robot interaction conference in Bielefeld last week.
They used a process called "goal babbling", thought to mimic the way a baby learns to grab things by continually reaching – a trial-and-error process through which the infant works out which muscles to move. Similarly, the robot remembers what happens to the trunk's position when tiny changes are made to the pressure in the thin pneumatic tubes feeding the artificial muscles. This creates a map that relates the trunk's precise position to the pressures in each tube.
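The babbling loop can be sketched in a few lines. The code below is a minimal illustration, not Steil and Rolf's actual algorithm or Festo's hardware interface: it assumes a toy two-muscle trunk with an invented forward model (`trunk_position`), makes tiny random pressure changes while remembering the tip position each setting produced, and then inverts that memory by nearest-neighbour lookup.

```python
import math
import random

def trunk_position(p1, p2):
    # Toy forward model (an assumption for illustration, not real physics):
    # two antagonistic pneumatic muscles bend and extend the trunk tip in 2D.
    bend = math.sin(p1 - p2)
    reach = 1.0 + 0.5 * (p1 + p2)
    return (reach * math.cos(bend), reach * math.sin(bend))

def babble(steps=2000, step_size=0.05, seed=0):
    # Explore by making tiny random changes to the muscle pressures,
    # recording which tip position each pressure setting produced.
    rng = random.Random(seed)
    p1, p2 = 0.5, 0.5
    memory = []
    for _ in range(steps):
        p1 = min(1.0, max(0.0, p1 + rng.uniform(-step_size, step_size)))
        p2 = min(1.0, max(0.0, p2 + rng.uniform(-step_size, step_size)))
        memory.append(((p1, p2), trunk_position(p1, p2)))
    return memory

def pressures_for(target, memory):
    # Inverse model: reuse the pressures whose remembered
    # position lies closest to the requested target.
    return min(memory, key=lambda m: math.dist(m[1], target))[0]
```

With enough babbling, asking `pressures_for` for a reachable target position returns a pressure setting whose resulting tip position is far closer to the target than simply holding the trunk at its resting pressures would be.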
The trunk can now be guided by hand into a series of positions and learn to adopt them on command – in other words, it can be trained to repeat actions and pluck anything from light bulbs to hazelnuts.
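That teach-and-replay step amounts to storing the muscle pressures recorded at each demonstrated pose and commanding them back later. The sketch below is an assumed interface, not Festo's or Bielefeld's software: pose names and the `PoseMemory` class are invented for illustration.

```python
class PoseMemory:
    """Store pressures captured while the trunk is held in a pose,
    then replay them by name to reproduce that pose on command."""

    def __init__(self):
        self.poses = {}

    def teach(self, name, pressures):
        # Called once the trunk has been pushed into a demonstrated pose:
        # snapshot the current muscle pressures under a label.
        self.poses[name] = tuple(pressures)

    def replay(self, name):
        # Return the stored pressures so the controller can
        # drive the muscles back to the demonstrated pose.
        return self.poses[name]
```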
I can vouch for that: as I move the bionic trunk in Steil's lab into different positions it initially resists, but then yields and follows my movement. The next time I try to push it to the same spot, it moves easily, because the behaviour has been learned. The robot now has muscle memory – which makes it seem even more alive.