Are We Close to Star Wars-like Prosthetics?

Researchers at Imperial College London discuss how a prototype sensor technology could make robotic prosthetics more user-friendly in the future.


Some robotic prosthetics currently in development are connected to the user via a socket, which seals around the stump and detects electrical signals from muscle movements. One challenge with this technology is maintaining an electrical connection to the muscle via sensors placed on the skin inside the socket. This connection can be lost, for example because of sweat, which can inhibit the prosthetic hand’s ability to function. Fabricating and calibrating the system for each user is also expensive.

The researchers from Imperial College London have developed a prototype sensor system that avoids this pitfall by detecting mechanical signals rather than electrical ones: tiny vibrations produced by muscle fibres as they move when muscles flex. These vibrations can be sensed and relayed to a robotic hand, making it move in response to the user’s muscles much as their own hand would.

The technology also includes a motion sensor system to detect arm movements. This enables a user to control how the robotic hand grasps objects of different sizes through a simple sequence of muscle flexes and arm movements.

The research was carried out by Dr Ravi Vaidyanathan and Samuel Wilson, a research postgraduate, who are both from the Department of Mechanical Engineering at Imperial College London. It builds on the work of Dr Richard Woodward, a former student of Dr Vaidyanathan’s, who was a Dyson Foundation scholar at the College.

The team caution that they have not yet carried out patient clinical trials with the technology. They still have a number of refinements to complete before the sensor and motion tracking system can be commercialised. However, the researchers believe their work is a step forward in making robotic prosthetics more robust in their design, which could widen their usefulness for patients.

Dr Vaidyanathan said: “Put your ear against your bicep and use your hand to form a seal between it and your arm. Now flex your arm and you should be able to hear your muscles rumbling and gurgling. We have developed a sensor technology that sits on your arm to detect these rumbles and another device that detects arm motions. This technology takes us a step closer to providing prosthetics that are potentially more robust, accessible and easier for patients to use in the future.”


The team have carried out a preliminary demonstration of the system with Alex Lewis, who lost both legs, his right arm and the use of his other arm following a rare infection.

To operate the robotic hand, Lewis had a small armband placed around his bicep, with a muscle sensor and motion-tracking electronics embedded in it. When he flexed his bicep, the vibrations produced were detected by the sensor, interpreted as signals and transmitted to a computer. A program then ran a mathematical algorithm designed to isolate Lewis’s muscle signal, filtering out other arm motions and sounds, and converted it into a command for the robotic hand.
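The article does not publish the team's algorithm, but the pipeline it describes can be illustrated with a minimal sketch: rectify the vibration signal, smooth it into an envelope, and threshold the envelope to decide whether a deliberate flex occurred. The window size and threshold here are illustrative assumptions, not the researchers' actual parameters.

```python
# Hypothetical sketch of the pipeline described above: rectify the
# muscle-vibration signal, smooth it with a moving average, and
# threshold the result. All numbers are illustrative only.

def flex_detected(samples, window=50, threshold=0.3):
    """Return True if the smoothed, rectified signal crosses threshold."""
    rectified = [abs(s) for s in samples]
    # Simple moving-average envelope of the rectified signal.
    envelope = [
        sum(rectified[i:i + window]) / window
        for i in range(len(rectified) - window + 1)
    ]
    return any(e > threshold for e in envelope)

# A strong vibration burst (a flex) versus low-level background noise:
flex = [0.8 * ((-1) ** i) for i in range(200)]
noise = [0.05 * ((-1) ** i) for i in range(200)]
print(flex_detected(flex))   # True
print(flex_detected(noise))  # False
```

A real system would use a properly designed band-pass filter and per-user calibrated thresholds, which is part of what makes fabricating and tuning such systems expensive.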

Lewis had the option of activating two different grip modes while the prosthetic lay detached from his arm on a lab bench. The first was a three-fingered pinch that could enable movements such as picking up a small object like a set of keys. The other was a power grip that could enable the prosthetic to grasp a larger object such as a glass of water.

Future refinements to the technology will include filtering out the range of interfering vibrations that could make the hand open or close unexpectedly. The team also plan to make the device more portable and to enable users to self-calibrate it, removing the need for engineers in the set-up process. Adding more sensors would also expand the range of commands, so that the prosthetic can perform more complex grasping tasks.

Lewis, who underwent amputation surgery following a very rare and devastating infection called Strep A Toxic Shock Syndrome, said: “It is really exciting to be part of this project to test the robotic hand system. Current prosthetics can be very cumbersome and hard on the body, so any technology that can reduce the burden on users is an important step forward. I look forward to seeing this device developed further.”

In the longer term, the team predict that the sensor system could also be adapted so that it could be used to control other technologies and appliances around the home, to further benefit people living with disabilities. They are also working with Imperial Innovations, which commercialises College research, to establish a start-up company to market the sensor and motion tracking technology.


