Monkeys Move Matter, Mentally

 

Using only signals from their brains and visual feedback on a video screen, rhesus monkeys have been taught by a team of Duke Medical Center researchers to consciously control the movement of a robot arm in real time. The monkeys appear to operate the robot arm as if it were their own limb, the scientists say.

Wave of the future: neurobiologist Nicolelis and robot-arm-working rhesus. Photo: Jim Wallace.

This achievement represents an important step toward technology that could enable paralyzed people to control "neuroprosthetic" limbs and even free-roaming "neurorobots" by using brain signals. Members of the research team--neurobiologists and biomedical engineers--say the technology they developed for analyzing brain signals from animals could also greatly improve rehabilitation of people with brain and spinal-cord damage from stroke, disease, or trauma. By understanding the biological factors that control the brain's adaptability, they say, clinicians could develop improved drugs and rehabilitation methods for people with such damage.

The findings appeared in an article published online by the Public Library of Science. Heading the research was Miguel Nicolelis, a physician, professor of neurobiology, and co-director of the Duke Center for Neuroengineering. Jose Carmena was lead author of the article, and senior co-author was Craig Henriquez, associate professor of biomedical engineering in the Pratt School of Engineering, who is also the center's co-director. The research was funded by the Defense Advanced Research Projects Agency and the James S. McDonnell Foundation.

The latest work is the first to demonstrate that monkeys can learn to use only visual feedback and brain signals, without resorting to any muscle movement, to control a mechanical robot arm--including both reaching and grasping movements. In their experiments, the researchers first implanted an array of microelectrodes, each smaller than the diameter of a human hair, into the frontal and parietal lobes of the brains of two female rhesus macaque monkeys. They chose those areas of the brain because they are known to be involved in producing multiple output commands to control complex muscle movement.

The faint signals from the electrode arrays were detected and analyzed by the computer system the researchers had developed to recognize patterns of signals that represented particular movements by an animal's arm. In the initial behavioral experiments, the research team recorded and analyzed the output signals from the monkeys' brains as the animals were taught to use a joystick both to position a cursor over a target on a video screen and to grasp the joystick with a specified force.
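The core idea of the pattern-recognition step described above — mapping recorded firing rates to the arm movements they accompany — can be sketched as a simple linear decoder fit by least squares. This is only an illustrative assumption; the article does not specify the Duke team's actual algorithms, and the neuron counts, tuning weights, and data here are synthetic:

```python
import numpy as np

# Illustrative sketch of a linear neural decoder (an assumption for
# illustration; the Duke system's actual algorithms are not given here).
# Each row of `rates` holds the recent firing rates of the recorded neurons;
# each row of `arm_pos` holds the simultaneous (x, y) hand position.

rng = np.random.default_rng(0)
n_samples, n_neurons = 500, 32

# Hypothetical "true" tuning: position is a weighted sum of firing rates.
true_weights = rng.normal(size=(n_neurons, 2))
rates = rng.poisson(lam=5.0, size=(n_samples, n_neurons)).astype(float)
arm_pos = rates @ true_weights + rng.normal(scale=0.1, size=(n_samples, 2))

# "Training" phase: fit decoder weights while the animal uses the joystick,
# pairing recorded brain activity with observed arm movements.
weights, *_ = np.linalg.lstsq(rates, arm_pos, rcond=None)

# "Brain control" phase: predict arm position from neural activity alone.
predicted = rates @ weights
error = np.abs(predicted - arm_pos).mean()
print(f"mean decode error: {error:.3f}")
```

Once such a mapping is calibrated, the decoder's output can drive a cursor or robot arm directly, with no muscle movement required — which is the feedback loop the experiments exploit.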

After the animals' initial training, the researchers made the cursor more than a simple display: its movement now incorporated the dynamics, such as inertia and momentum, of a robot arm operating in another room. Although the animals' performance initially declined when the robot arm was included in the feedback loop, the monkeys quickly learned to allow for these dynamics and became proficient at manipulating the robot-driven cursor, the team reported.

The researchers next removed the joystick. At that point, the monkeys continued to move their arms in mid-air to manipulate and "grab" the cursor, thus controlling the robot arm. "The most amazing result, though, was that after only a few days of playing with the robot in this way, the monkey suddenly realized that she didn't need to move her arm at all," says Nicolelis. "Her arm muscles went completely quiet, she kept the arm at her side and she controlled the robot arm using only her brain and visual feedback. Our analyses of the brain signals showed that the animal learned to assimilate the robot arm into her brain as if it were her own arm."

Analysis of the signals from the animals' brains as they learned, says Nicolelis, revealed that the brain circuitry was actively reorganizing itself to adapt. "It was extraordinary to see that when we switched the animal from joystick control to brain control, the physiological properties of the brain cells changed immediately. And when we switched the animal back to joystick control the very next day, the properties changed again. Such findings tell us that the brain is so amazingly adaptable that it can incorporate an external device into its own 'neuronal space' as a natural extension of the body. Actually, we see this every day when we use any tool, from a pencil to a car. As we learn to use that tool, we incorporate the properties of that tool into our brain, which makes us proficient in using it."

According to Nicolelis, the findings will have direct application to clinical development of neuroprosthetic devices for paralyzed people. "There is certainly a great deal of science and engineering to be done to develop this technology and to create systems that can be used safely in humans," he says. "However, the results so far lead us to believe that these brain-machine interfaces hold enormous promise for restoring function to paralyzed people." His team is already conducting preliminary studies of human subjects.

Henriquez and the research team's other biomedical engineers from Duke's Pratt School of Engineering are also working to miniaturize the components, to create wireless interfaces, and to develop different grippers, wrists, and other mechanical components of a neuroprosthetic device.
