Wearing a Robot to Fight Paralysis

(image from comics.org)

In Iron Man, Tony Stark engineers himself a robotic suit of armor that serves two purposes: fighting the terrorists who took him captive and keeping pieces of shrapnel from puncturing his heart. Based on a new study from a University of Chicago neuroscience laboratory, wearable robots like Iron Man’s suit may also serve a dual purpose for a very different type of user: quadriplegic patients.

Scientists, in an effort worthy of comic books, have successfully developed brain-machine interfaces that allow people to move computer cursors and prosthetic arms with their thoughts alone. When paralysis occurs due to a spinal cord injury or neurological disease, signals from the brain fail to reach the muscles of the body. But the brain electrical activity normally responsible for movement remains intact, and brain-machine interfaces (BMIs) seek to translate that information into the operation of an external device. One such BMI, called BrainGate, was successfully tested in quadriplegic patients 4 years ago.

However, while those patients were able to hit various computer targets and even type e-mails with their thoughts, their control of the cursor was somewhat shaky. When a person moves a computer cursor the old-fashioned way – with a hand on a mouse – information moves in two directions. Signals from the brain travel to the hand directing the movement, and sensory feedback goes back to the brain reporting on the movement’s success, both from the eyes tracking the cursor and from the location and motion of the hand in space. This latter sense, called proprioception or kinesthetic feedback, was not present in the BrainGate trials; the patients had only visual feedback to help adjust their movements.

“In the early days when we were doing this, we didn’t even consider sensory feedback as an important component of the system,” said Nicholas Hatsopoulos, professor and chair of computational neuroscience at the University of Chicago. “We really thought it was just one-way: signals were coming from the brain, and then out to control the limb. It’s only more recently that the community has really realized that there is this loop with feedback coming back.”

Monkeys were taught to move a cursor with their thoughts to random square targets. When a robot moved their arm in tandem with the cursor to give the monkey additional sensory feedback (c), the paths to each target were straighter than when the monkey could only use visual feedback to control the cursor (b). (Hatsopoulos Lab/Journal of Neuroscience)

To test whether adding proprioception back in would improve the performance of a BMI, Aaron Suminski and Dennis Tkach added a new component to the BMI set-up: an exoskeletal robot arm worn like a sleeve by the subject. Monkeys trained to move a computer cursor without moving their limbs wore the robot arm, which was programmed to move in tandem with the cursor. So while the monkeys operated the cursor with only their thoughts, the arm followed the motion and provided kinesthetic feedback to the brain.

With this additional sensory information, use of the BMI improved. As reported in The Journal of Neuroscience, the monkeys moved their cursors to the targets faster and along straighter paths than in trials without the robot arm providing feedback. The effect could also be seen directly in the brain, where activity in the motor cortex carried more information with the robot arm than without, demonstrated by an improved signal-to-noise ratio.
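What "improved signal-to-noise ratio" means here can be illustrated with a toy calculation (a hypothetical sketch for intuition, not the authors' actual analysis): if a motor-cortex neuron's firing rate is modeled as a movement-tuned signal plus random noise, the SNR is the ratio of movement-related variance to residual variance, and less noisy activity yields a higher ratio.

```python
import numpy as np

rng = np.random.default_rng(0)

def snr(firing_rates, tuned_component):
    """Ratio of movement-related variance to residual (noise) variance."""
    noise = firing_rates - tuned_component
    return np.var(tuned_component) / np.var(noise)

# Simulate one direction-tuned neuron (cosine tuning, a standard model).
angles = rng.uniform(0, 2 * np.pi, 1000)       # movement directions (rad)
signal = 10 + 5 * np.cos(angles)               # tuned firing rate (Hz)

weak_noise = rng.normal(0, 1.0, angles.size)   # less noisy activity
strong_noise = rng.normal(0, 3.0, angles.size) # noisier activity

snr_with = snr(signal + weak_noise, signal)       # "with feedback" condition
snr_without = snr(signal + strong_noise, signal)  # "without feedback" condition
```

In this sketch, the less noisy neuron carries more recoverable information about movement direction, which is the sense in which the monkeys' motor cortex became more informative with the robot arm engaged.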

Even though the monkeys used in this study were not paralyzed in any way, the results may still be applicable to at least some quadriplegic patients, Hatsopoulos said. Many spinal cord injuries damage motor neurons but leave sensory neurons at least partially intact, preserving a residual pathway that can carry kinesthetic feedback to the brain. In the disease amyotrophic lateral sclerosis (ALS), motor neurons die out, leading to paralysis, but sensory neurons are spared. Such patients could be outfitted with a wearable robotic device, similar to those being developed by scientists in Japan and elsewhere, that is controlled by brain activity while simultaneously sending sensory information back to the brain.

For those patients lacking both motor and sensory function, benefits may still be possible with slightly more complex technology. In collaboration with Sliman Bensmaia (whom we previously saw working on Luke Skywalker’s hand), Hatsopoulos’ laboratory is looking at ways to stimulate the sensory part of the brain directly, replicating kinesthetic feedback without having to physically move the arm at all.

“I think all the components are there; there’s nothing here that’s holding us back conceptually,” Hatsopoulos said. “I think using these wearable robots and controlling them with the brain is, in my opinion, probably the most promising approach to take in helping paralyzed individuals regain the ability to move.”

About Rob Mitchum
Rob Mitchum is communications manager at the Computation Institute, a joint initiative between The University of Chicago and Argonne National Laboratory.