The robotic arm glided slowly toward the woman’s face, bringing the bottle and its straw ever closer. The arm stopped and the woman parted her lips. She took a sip and smiled.
What was remarkable about this demonstration, recently conducted by BrainGate, a federally funded research project, was not that a robotic arm could be made to move both gently and precisely (that’s been possible for many years), but that the machine’s actions were controlled by the woman’s thoughts alone.
In research labs worldwide, scientists are looking to make “mind over matter” an everyday activity. With the help of sophisticated sensors and high-powered computer technologies, one of robotics’ holy grails is finally coming within reach: machine control via thought. For the woman participating in BrainGate’s demonstration, the event was also a personal milestone: it marked the first time since suffering a stroke 15 years ago that she was able to sip a drink without assistance from a caregiver.
BrainGate’s neural interface system connects a sensor that detects and monitors brain signals to computer software and hardware that transforms the signals into digital commands recognized by external devices. The sensor is a baby aspirin-sized square of silicon containing 100 hair-thin electrodes that individually record activity within small groups of brain cells. The device is implanted into the motor cortex, a part of the brain that directs movement.
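The article does not detail BrainGate’s actual decoding algorithm, but the pipeline it describes — per-electrode activity transformed into commands a device can follow — can be sketched very loosely as a linear decoder. Everything below (the decoder weights, the 2-D velocity output, the simulated firing rates) is an illustrative assumption, not the project’s real method.

```python
import numpy as np

# Illustrative sketch only: firing rates from the implant's ~100 electrodes
# are mapped to a 2-D velocity command for an external device. The weights
# here are random placeholders; a real system learns them during calibration.

rng = np.random.default_rng(0)

N_ELECTRODES = 100  # the sensor described in the article has 100 electrodes
W = rng.normal(size=(2, N_ELECTRODES)) * 0.01  # hypothetical decoder weights
b = np.zeros(2)                                # hypothetical bias term

def decode_velocity(firing_rates: np.ndarray) -> np.ndarray:
    """Map a vector of per-electrode firing rates (Hz) to an (x, y)
    velocity command for a robotic arm or cursor."""
    return W @ firing_rates + b

# One time-step: simulated firing rates in, a velocity command out.
rates = rng.poisson(lam=20.0, size=N_ELECTRODES).astype(float)
vx, vy = decode_velocity(rates)
```

The 31-minute calibration sessions mentioned later in the article correspond, in this toy picture, to fitting `W` and `b` to the user’s own neural activity.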
In the trial described at the beginning of this article, the 58-year-old woman learned to perform complex tasks with a robotic arm by imagining the movements of her own arms and hands. For one task, a series of foam targets was mounted below a tabletop and programmed to pop up individually at different positions and heights. The woman had less than 30 seconds to grasp each target using a second-generation DEKA Arm System developed by Manchester, N.H.-based robotics manufacturer DEKA.
The trial also evaluated a DLR Light-Weight Robot III arm, developed by the German Aerospace Center, headquartered in Oberpfaffenhofen-Wessling, Germany. The DLR arm, designed for use as an external assistive device, is heavier than the DEKA device. The woman used this arm prior to the DEKA arm in the foam target task and had a success rate of 21 percent. Using the DLR arm to reach for the bottled drink, bring it to her mouth and sip from a straw, she was able to complete four out of six attempts.
John Donoghue, BrainGate’s technology development leader and director of the Institute for Brain Science at Brown University, believes that it will only be a matter of time before mind-controlled limbs become an everyday reality. “We’re getting closer to restoring some level of everyday function to people with limb paralysis,” he says.
Meanwhile, a recent study conducted by neuroscientists at the University of California, Berkeley, and the Champalimaud Center for the Unknown in Portugal indicates that the brain is more flexible and trainable than previously thought. The study showed that through a process called “plasticity,” parts of the brain can be trained to do things they normally do not do. In other words, the same brain circuits used in the learning of motor skills, such as riding a bike or driving a car, can be put to work mastering purely mental tasks, even arbitrary ones.
“What we hope is that our new insights into the brain’s wiring will lead to a wider range of better prostheses that feel as close to natural as possible,” says Jose Carmena, a UC Berkeley associate professor of electrical engineering, cognitive science and neuroscience. “They suggest that learning to control a BMI (brain-machine interface), which is inherently unnatural, may feel completely normal to a person, because this learning is using the brain’s existing built-in circuits for natural motor control.”
The BrainGate researchers next plan to test their technology in more volunteers. They envision a system that would be stable for decades, wireless and fully automated. The current system requires the sensor and its user to be connected to the robot arm and support hardware via cables. During the trial, prior to each session, a technician had to perform a calibration procedure that lasted an average of 31 minutes. The researchers are also looking to enhance the system’s precision and control speed. In the foam target test, for example, a successful reach-and-grasp motion typically took almost 10 seconds to complete.
Still, the researchers feel that they are not only advancing technology, but also the human condition. Project lead investigator Leigh Hochberg, an associate professor of engineering at Brown University in Providence, R.I., and a critical care neurologist at Massachusetts General Hospital/Harvard Medical School in Boston, says he felt gratified by the woman’s response to the tests. “The smile on her face was a remarkable thing to see,” he says. “We were encouraged that the research is making the kind of progress that we had all hoped.”
Putting on the Brain Cap
In a research project currently underway at the University of Maryland, a “brain cap” technology is allowing wearers to turn their thoughts into robotic motion without first having to endure a brain sensor implant. Associate Professor of Kinesiology José ‘Pepe’ L. Contreras-Vidal and his team set out to create the non-invasive, sensor-lined cap in an effort to improve the lives of millions of people. “We are on track to develop, test and make available to the public—within the next few years—a safe, reliable, noninvasive brain computer interface that can bring life-changing technology to people whose ability to move has been diminished due to paralysis, stroke or other injury or illness,” he says.
Contreras-Vidal and his co-researchers are currently collaborating on a variety of projects with scientists and engineers at other institutions to create thought-controlled robotic prosthetics. “We are doing something that few previously thought was possible,” Contreras-Vidal says. “We use EEG [electroencephalography] to non-invasively read brain waves and translate them into movement commands for computers and other devices.”
In a National Institutes of Health (NIH)-supported project currently in progress, Contreras-Vidal and his colleagues are pairing their brain cap’s EEG-based technology with a Defense Advanced Research Projects Agency (DARPA)-funded next-generation robotic arm. The device, developed by researchers at the Johns Hopkins Applied Physics Laboratory, is designed to function like a human limb. In another collaboration, the team is working with New Zealand start-up Rex Bionics, the developer of a powered lower-limb exoskeleton called Rex that could be used to restore gait after spinal cord injury.
Contreras-Vidal notes that a great deal of progress has been made over the past couple of decades in the study of direct brain-to-computer interfaces, most of it through studies using monkeys with electrodes implanted in their brains. Most humans, however, don’t want holes drilled into their heads or wires attached to their brains.
“EEG monitoring of the brain ... has been largely ignored by those working on brain-machine interfaces, because it was thought that the human skull blocked too much of the detailed information on brain activity needed to read thoughts about movement and turn those readings into movement commands for multifunctional, high-degree-of-freedom prosthetics,” Contreras-Vidal says.
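A common first step in EEG-based interfaces like the one Contreras-Vidal describes is isolating a frequency band from the noisy scalp recording before any decoding is attempted. The sketch below is a generic illustration of that step, not the Maryland team’s method; the sampling rate and the 8–30 Hz band edges (a range often associated with motor activity) are assumptions.

```python
import numpy as np
from scipy.signal import butter, filtfilt

# Illustrative only: band-pass filter one channel of simulated scalp EEG.
# FS and the band edges below are assumed values, not from the article.

FS = 256.0  # Hz, a typical EEG sampling rate

def bandpass(signal: np.ndarray, lo: float, hi: float, fs: float = FS) -> np.ndarray:
    """Zero-phase band-pass filter of a 1-D EEG channel."""
    b, a = butter(4, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return filtfilt(b, a, signal)  # filter forward and backward: no phase lag

# Simulated one-second channel: a 12 Hz oscillation buried in broadband noise.
t = np.arange(0, 1.0, 1.0 / FS)
raw = np.sin(2 * np.pi * 12 * t) + 0.5 * np.random.default_rng(1).normal(size=t.size)
mu_band = bandpass(raw, 8.0, 30.0)
```

Filtering discards the out-of-band noise while keeping the 12 Hz component, which is why the filtered trace carries less total power than the raw one.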
The brain cap aims to make using a thought-controlled robot as easy as putting on a hat.