UC Santa Cruz researcher Jacob Rosen is developing a robotic exoskeleton that has the potential to help the disabled and infirm.
Like many others, University of California, Santa Cruz researcher Jacob Rosen is developing a robotic exoskeleton that has the potential to help the disabled and infirm. Unlike other exoskeleton projects, however, Rosen’s design allows the user to reach 95 percent of the arm’s natural range of motion. Rosen is also investigating the use of neurological signals to control the exoskeleton.
By Tim Stephens
Jacob Rosen is developing a wearable robotic “exoskeleton” that could enable a person to lift heavy objects with little effort. It’s a bit like the robotic armor that has long been a staple of futuristic battle scenes in science fiction books and movies. But what excites Rosen is the device’s potential to help people disabled by stroke or degenerative diseases.
“People with muscular dystrophy and other neuromuscular disabilities could use the exoskeleton to amplify their muscle strength, and it could also be used for rehabilitation and physical therapy,” said Rosen, an associate professor of computer engineering in the Jack Baskin School of Engineering at the University of California, Santa Cruz.
“One of the major challenges in this field is to establish an effective human-machine interface, or ‘bio-port,’ between the operator and the wearable robot, such that the robot becomes a natural extension of the human body,” he said. “This bio-port may be established at the neural level, allowing the human brain to control the wearable robot with the same type of signals that it uses to control its own actuators, the muscles.”
Rosen, who came to UCSC last year, joins a growing number of faculty members in the Baskin School of Engineering who are using their expertise to tackle biomedical problems. His medical robotics projects include the exoskeleton as well as a remotely operated robotic system for performing surgical operations.
Rosen’s research focuses especially on the interface and interaction between humans and robots in these systems. “Medical robotics is by definition a multidisciplinary field, and that’s one reason I was so attracted to it,” he said. “One of the most challenging issues in research and development of medical technology is to create a multidisciplinary group of clinicians and engineers that can effectively communicate and collaborate. We speak different languages, and we have to overcome these barriers in order to work together. But the opportunities to benefit people’s lives are tremendous.”
Rosen’s current exoskeleton prototype, developed with a grant from the National Science Foundation, consists of two wearable robotic arms mounted on one wall of his UCSC laboratory. Inserting his arms into it, he notes that serious injury could result from a bad mismatch between the structure of the exoskeleton and the anatomical structure of the human arm.
Two innovations distinguish Rosen’s prototype from other exoskeleton designs. One is a special design of the exoskeleton arms that allows the user to reach 95 percent of the natural range of motion, or “workspace,” of the human arm. The other, which is still the subject of active research, is a method for using neurological signals to control the exoskeleton.
“By using the body’s own control signals as input to the exoskeleton, you can achieve a natural control of the robot by the human operator as an extension of his or her body,” Rosen said.
The system uses noninvasive surface electromyography (EMG), in which surface electrodes placed on the skin detect neural activity in individual muscles or muscle groups. The system takes advantage of a natural physiological time delay between the neural activation of a muscle and the actual movement generated by contraction of the muscle. During that time delay, a computer algorithm can analyze and process the EMG signals, which are then fed into computer models of the muscular system along with additional information regarding joint position and velocity.
The muscle models predict the joint movements that the operator’s muscles will generate in response to the neural signals detected by EMG, and these predictions are then sent to the exoskeleton so that it moves in concert with the operator’s arm.
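In outline, that control pipeline — rectify and smooth the EMG to estimate muscle activation, feed the estimate into a muscle model along with joint position and velocity, and command the exoskeleton with the predicted output — might look like the following sketch for a single elbow joint. This is an illustration only, not Rosen’s actual algorithm; the filter constants and the toy muscle model (loosely mimicking force-length and force-velocity relationships) are assumptions.

```python
import math

def rectify_and_smooth(emg_samples, alpha=0.1):
    """Rectify raw EMG and apply an exponential moving average to
    estimate muscle activation (illustrative processing only)."""
    activation = 0.0
    for s in emg_samples:
        activation = alpha * abs(s) + (1 - alpha) * activation
    return activation

def predict_elbow_torque(activation, joint_angle, joint_velocity,
                         max_torque=40.0):
    """Toy muscle model: scale a maximum torque by activation and by
    angle- and velocity-dependent factors. All constants are made up."""
    force_length = math.cos(joint_angle - math.pi / 2) ** 2
    force_velocity = max(0.0, 1.0 - 0.2 * abs(joint_velocity))
    return max_torque * activation * force_length * force_velocity

# Synthetic EMG burst preceding movement; the physiological delay
# between neural activation and contraction gives the controller
# time to run this prediction before the arm actually moves.
emg_window = [0.2 * math.sin(0.3 * i) for i in range(50)]
activation = rectify_and_smooth(emg_window)
torque = predict_elbow_torque(activation,
                              joint_angle=math.pi / 2,
                              joint_velocity=0.5)
print(f"estimated activation: {activation:.3f}")
print(f"commanded elbow torque: {torque:.2f}")
```

The commanded torque would then drive the exoskeleton’s elbow actuator so that it moves in concert with, rather than behind, the operator’s arm.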
Rosen has demonstrated this system for control of a single joint (flexing and extending the elbow). Now he is working to implement it to control the more complicated movements of the full exoskeleton arm, with funding from the U.S. Army Telemedicine and Advanced Technologies Research Center. He is also working on a lower limb exoskeleton to fit over the legs.
Some potential applications of the exoskeleton would not necessarily require the neuromuscular control system. For physical therapy, for example, the exoskeleton could be programmed to help a patient perform predetermined movements, Rosen said.
“When a stroke affects the motor cortex, there is typically paralysis on one side of the body. Rehabilitation relies on the plasticity of the brain to regain movement, but the learning process requires a lot of time in physical therapy,” he said.
Motor control in the brain may be reduced to the level of an infant first learning to engage with the world, he said. For an adult, it is frustrating and challenging, both physically and intellectually.
“The patient’s progress is often limited by the amount of time they’re able to spend with a physical therapist, so an automated physical therapy device could speed up the recovery process, and adding a virtual reality component may make it more intellectually engaging,” Rosen said.
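One standard way such a device could script predetermined therapy movements is with a minimum-jerk trajectory, which produces the smooth, human-like motion profiles commonly used in rehabilitation robotics. The article does not specify Rosen’s trajectory generator, so the following is a generic sketch:

```python
def minimum_jerk(theta_start, theta_end, duration, t):
    """Minimum-jerk joint trajectory: smoothly blend from a start
    angle to an end angle over the given duration (a common choice
    for guided rehabilitation movements; illustrative only)."""
    s = min(max(t / duration, 0.0), 1.0)     # normalized time in [0, 1]
    blend = 10 * s**3 - 15 * s**4 + 6 * s**5  # minimum-jerk polynomial
    return theta_start + (theta_end - theta_start) * blend

# Guide an elbow from 0 to 90 degrees over 2 seconds,
# sampling the commanded angle every half second.
waypoints = [minimum_jerk(0.0, 90.0, 2.0, t / 10) for t in range(0, 21, 5)]
print(waypoints)
```

The exoskeleton would track these waypoints with the patient’s arm inside it, repeating the movement far more often than a therapist’s schedule would allow.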
The exoskeleton can also be used to provide force feedback in a virtual reality system--to create the sensation of holding or interacting with virtual objects, for example. “Haptic” technology provides an interface with the user through the sense of touch, and using the exoskeleton as a haptic device has many applications in scientific visualization and manipulation, gaming, and simulation, Rosen said. Haptic feedback is also needed in surgical robotics to give the surgeon a sense of touch as he operates the robot from a separate console that may be located in a remote site, Rosen said.
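The classic textbook example of haptic rendering — not specific to Rosen’s system — is a virtual wall: when the operator’s hand penetrates the surface of a virtual object, the device pushes back with a spring-like force proportional to the penetration depth. A minimal sketch, with an illustrative stiffness value:

```python
def virtual_wall_force(position, wall_position=0.0, stiffness=500.0):
    """Render a virtual wall as a stiff spring: inside the wall,
    push back proportionally to penetration depth (positions in
    meters, force in newtons; stiffness value is illustrative)."""
    penetration = position - wall_position
    if penetration > 0:          # hand is inside the virtual object
        return -stiffness * penetration
    return 0.0                   # free space: no force rendered

# Hand 4 mm into the wall -> a 2 N restoring force
print(virtual_wall_force(0.004))
```

Run at a high, fixed rate, this kind of loop is what makes a virtual surface feel solid to the operator — the same principle that would let a remote surgeon feel tissue through a robotic console.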
Rosen was a partner in a major research and development project, led by SRI and funded by DARPA, which envisioned the operating room of the future as a fully automated cell that includes only one human being--the patient--while a surgeon remotely controls a surgical robot and other automated systems. The project showed that an operating room can be fully automated, but there are enormous obstacles to achieving automation of the surgical procedure itself. Even automating something as routine as placing sutures requires sophisticated models of the biomechanics of soft tissue.
“This is a huge challenge. I’ll probably spend a large part of my career on this,” Rosen said.
Skeptics argue that there is no need to create robots to do what is already being done safely and effectively by surgeons. But a surgical robot can provide manipulations and views of anatomical structures and surgical sites that are not accessible to human fingers or within the line of sight of a surgeon. Rosen, who is currently editing a book on surgical robotics, argues that robotic systems could enable surgeons to explore procedures that would not otherwise be possible. “Sometimes you have to provide the tool first, and then people discover how they can use it,” he said.
Rosen and his collaborators plan to build a surgical robot in his new lab at UCSC. If he can get sufficient funding, he’d like to build seven duplicate systems and give them to other research groups throughout the United States. “The more people there are developing new algorithms for surgical robotics, the faster the field will progress,” he said.
Robotics Trends would like to thank the University of California, Santa Cruz for permission to reprint this article. The original can be found at http://www.ucsc.edu/news_events/text.asp?pid=2668.