A brain–machine interface designed to let paralyzed subjects control devices with their thoughts performed better with the addition of a robotic arm providing sensory feedback, a new study from the University of Chicago finds.
Devices that translate brain activity into the movement of a computer cursor or an external robotic arm have already proven successful in humans. But in these early systems, vision was the only tool a subject could use to help control the motion. The new research uses a subject’s sensation of body movement to improve control of such devices.
A study published Dec. 15 in The Journal of Neuroscience reports that monkeys performed better with a brain–computer interface when their arms were linked with a robot arm that provided kinesthetic information about movement and position in space. Incorporating this sense may improve the design of “wearable robots” to help patients with spinal cord injuries, researchers said.
“A lot of patients that are motor-disabled might have partial sensory feedback,” said Nicholas Hatsopoulos, associate professor and chair of computational neuroscience at the University of Chicago. “That got us thinking that maybe we could use this natural form of feedback with wearable robots to provide that kind of feedback.”
In the experiments, monkeys controlled a computer cursor without actively moving their arms, via a device that translated activity in the primary motor cortex of their brains into cursor motion. But when researchers outfitted the monkeys with a sleeve-like robotic exoskeleton that moved their arms in tandem with the cursor, the monkeys’ control of the cursor improved: they hit targets faster and along straighter paths than without the exoskeleton.
“We saw a 40 percent improvement in cursor control when the robotic exoskeleton passively moved the monkeys’ arm,” Hatsopoulos said. “This could be quite significant for daily activities being performed by a paralyzed patient that was equipped with such a system.”
Humans use sensory feedback, called proprioception, when they move their arms or hands. For example, when a person reaches out to grab a coffee mug, sensory neurons in the arm and hand send information back to the brain about where the limbs are positioned and how they are moving. Proprioception tells a person where the arm is, even with the eyes closed.
But in patients with conditions in which sensory neurons die out, executing basic motor tasks such as buttoning a shirt or even walking becomes exceptionally difficult. Paraplegic subjects in the early clinical trials of brain–machine interfaces faced similar difficulty in attempting to move a computer cursor or robot arm using only visual cues. Those troubles helped researchers realize the importance of proprioceptive feedback, Hatsopoulos said.
“In the early days when we were doing this, we didn’t even consider sensory feedback as an important component of the system,” Hatsopoulos said. “We really thought it was just one-way: Signals were coming from the brain, and then out to control the limb. It’s only more recently that the community has really realized that there is this loop with feedback coming back.”
Reflecting this loop, the researchers on the new study also observed changes in the brain activity recorded from the monkeys when sensory feedback was added to the setup. With proprioceptive feedback, the cell-firing patterns of the primary motor cortex carried more information than in trials with only visual feedback, Hatsopoulos said, reflecting an improved signal-to-noise ratio.
The improvement seen from adding proprioceptive feedback may inform the next generation of brain–machine interface devices, Hatsopoulos said. Already, scientists are developing different types of “wearable robots” to augment a person’s natural abilities. Combining a decoder of cortical activity with a robotic exoskeleton for the arm or hand can serve a dual purpose: allowing a paralyzed subject to move the limb, while also providing sensory feedback.
To benefit from this solution, a paralyzed patient must have retained some residual sensory information from the limbs despite the loss of motor function — a common occurrence, Hatsopoulos said, particularly in patients with ALS, locked-in syndrome or incomplete spinal cord injury. For patients who have lost both motor and sensory function, direct stimulation of the sensory cortex may simulate the sensation of limb movement. Further research in that direction is currently under way, Hatsopoulos said.
“I think all the components are there; there’s nothing here that’s holding us back conceptually,” Hatsopoulos said. “I think using these wearable robots and controlling them with the brain is, in my opinion, probably the most promising approach to take in helping paralyzed individuals regain the ability to move.”
The paper, “Incorporating feedback from multiple sensory modalities enhances brain–machine interface control,” appears in the Dec. 15 issue of The Journal of Neuroscience. Authors include Aaron J. Suminski, Dennis C. Tkach and Hatsopoulos of the University of Chicago; and Andrew H. Fagg of the University of Oklahoma.
Funding for the research was provided by the National Institute of Neurological Disorders and Stroke and the Paralyzed Veterans of America Research Foundation.