Using Brain Signals to Drive Prosthetic Devices

Scientists at the California Institute of Technology have taken an important step toward a strategy that uses the higher-level neural activity of the brain to drive a prosthetic device. Such a strategy would allow paralyzed individuals to use their thoughts to move a device when they cannot move their limbs. The project is funded by the National Eye Institute as a Bioengineering Research Partnership grant and has benefited significantly from the participation of engineers at both the California Institute of Technology and the Jet Propulsion Laboratory.

For decades, neuroscientists have been working to develop prosthetic devices that are driven by brain signals. Most of this work has been focused on brain activity related to hand trajectory signals recorded from the area of the brain known as the motor cortex, which governs movement. However, because so many movement-related areas of the cortex converge into this one pathway, a prosthetic device based on motor cortical signals might only be able to perform one task at a time. Moreover, this approach will not work if motor cortical pathways are damaged by disease or injury.

A new, innovative approach by Richard Andersen, Sam Musallam and their colleagues at the California Institute of Technology relies on brain signals that initiate movement based on sensory input. Using this method with trained monkeys, the investigators decoded brain signals related to reaching movements in order to position a cursor on a computer screen. The work traces back to the mid-1990s, when Andersen and colleagues discovered a visual area of the brain called the parietal reach region (PRR) in the parietal cortex of monkeys. The PRR was found to be involved in planning movements based on preferences and goals. For example, if a monkey is given the choice of reaching for an apple or an orange, PRR activity reflects which fruit the animal prefers. This abstract, high-level PRR activity precedes the lower-level motor cortical activity that controls hand trajectories. In 2003, Andersen and collaborators from the University of Western Ontario identified the PRR in humans.

The discovery of the PRR and its cognitive function led Dr. Andersen to consider creating a neural interface that could decode signals from PRR neurons, allowing people with paralysis to manipulate prosthetic limbs or robotic devices with their thoughts. Andersen and Musallam created a specially designed, implantable multi-electrode array that connects a brain-signal decoder to a computer cursor. The Caltech researchers implanted the device in the brains of monkeys and then trained the monkeys to position the cursor at a particular location on a computer screen without actually performing the physical movement.

The researchers designed reward-motivation experiments in which the monkeys were given juice when their PRR activity reflected the plan to move a cursor to a specific location on a computer screen. Because PRR neurons can hold a memory of reward, the monkeys' PRR learned, through repeated trials, which planned movements led to reward. The researchers then varied the type, amount and frequency of the juice reward in a repeated pattern. PRR cells fired more strongly before the expected delivery of a preferred juice reward, allowing the researchers to extract a value signal reflecting how highly the monkey rated the reward. Reading this activity through the neural prosthesis, the Caltech researchers decoded the signals and moved the computer cursor based solely on the activity of the PRR.
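The general idea of decoding a planned movement from recorded neural activity can be illustrated with a simple sketch. This is not the Caltech team's actual decoder; it is a minimal, hypothetical example in which each on-screen target is associated with an average pattern of firing rates across electrodes, and a new recording is classified to whichever target's pattern it most closely matches.

```python
import numpy as np

# Illustrative sketch (not the actual Caltech algorithm): during training,
# average spike counts per electrode are stored for each screen target.
# At test time, the planned target is decoded as the one whose stored
# firing pattern is nearest to the newly observed spike counts.

rng = np.random.default_rng(0)
n_electrodes, n_targets = 16, 4

# Hypothetical training data: mean firing rate of each electrode
# while the monkey plans a reach toward each of the four targets.
target_templates = rng.uniform(5, 40, size=(n_targets, n_electrodes))

def decode_target(spike_counts, templates):
    """Return the index of the target template closest to the observed counts."""
    distances = np.linalg.norm(templates - spike_counts, axis=1)
    return int(np.argmin(distances))

# Simulate a noisy planning-period recording while the monkey intends target 2,
# then decode it; the cursor would be driven to the decoded target.
observed = target_templates[2] + rng.normal(0.0, 1.0, size=n_electrodes)
print(decode_target(observed, target_templates))
```

Real decoders for prosthetic control are far more sophisticated (they must handle trial-to-trial variability, continuous movement, and changing electrode signals), but the core step is the same: mapping a pattern of neural activity onto an intended goal.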

Until now, no one had succeeded in tapping the messages of higher-order neurons involved in planning and motivation for use in prosthetics. Although much work remains, this breakthrough offers proof of concept for decoding higher-level brain signals to manipulate physical objects. Such an approach might make it possible to operate a number of devices, including robotic limbs, wheelchairs, computers and even cars. A wide range of higher-order brain signals might also be interpreted through a prosthetic device to give voice to patients who cannot speak, by allowing them merely to think about what they would like to say.

Posted: February 2011