Turning Thoughts into Action for Millions of Prosthetic-dependent People
University of Maryland Associate Professor of Kinesiology José ‘Pepe’ Contreras-Vidal wears his Brain Cap, a non-invasive, sensor-lined cap with neural interface software that soon could be used to control computers, robotic prosthetic limbs, motorized wheelchairs and even digital avatars. Photo courtesy of John Consoli, University of Maryland.
“If you can imagine it, you can achieve it.” An inspirational saying, yes. But prosthetic limbs that amputees may directly control with their brains and that will allow them to feel what they touch? Science fiction?
“There’s nothing fictional about this,” said Marcia O’Malley of Rice University.
O’Malley is one of four principal investigators from four U.S. universities funded by the National Science Foundation embarking on a four-year program to design such prosthetics. Brent Gillespie from the University of Michigan, José Contreras-Vidal from the University of Maryland and Patricia Shewokis from Drexel University round out the collaborative research team.
“These researchers have developed a unique approach,” said NSF Human-centered Computing program manager Ephraim P. Glinert. “The team will build upon their prior work to design and validate non-invasive neural decoders that generate agile control in upper limb prosthetics. This is exciting, and will have broad implications for potentially millions, including many recent war veterans, who face daily challenges associated with prosthetic limbs.”
O’Malley agrees. “Researchers have already demonstrated that much of this is possible. What remains is to bring all of it — non-invasive neural decoding, direct brain control and haptic sensory feedback — together in one device.”
Contreras-Vidal, who is an associate professor of kinesiology, and his team have created a non-invasive, sensor-lined cap that forms a “brain computer interface” that could control computers, robotic prosthetic limbs, motorized wheelchairs, and even digital avatars. The team has published three major papers on their technology over the past 18 months, the latest a just-released study in the Journal of Neurophysiology in which they successfully used EEG brain signals to reconstruct the complex 3-D movements of the ankle, knee and hip joints during human treadmill walking.
In two earlier studies, they showed similar results for 3-D hand movements, and they demonstrated that subjects wearing the brain cap could move a cursor on a computer screen simply by thinking about it. Non-invasively tapping into the user’s neural activity with a cap of electrodes that reads electrical signals on the scalp via electroencephalography (EEG) made these results possible.
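The core idea behind this kind of EEG-based decoding is that a continuous movement trajectory can be predicted from a short window of recent scalp signals. The sketch below is a hypothetical, toy illustration of that approach using time-lagged linear regression on synthetic data; the signal sizes, weights and the `lagged_features` helper are assumptions for demonstration, not the team's actual decoder.

```python
import numpy as np

# Toy illustration of time-lagged linear decoding: predict a kinematic
# trajectory (e.g., a joint angle) from multi-channel "EEG" using a
# window of the current and past samples of every channel.
rng = np.random.default_rng(0)
n_samples, n_channels, n_lags = 2000, 8, 10

# Synthetic EEG-like signals and a trajectory linearly related to them.
eeg = rng.standard_normal((n_samples, n_channels))
true_weights = rng.standard_normal(n_channels * n_lags)

def lagged_features(x, lags):
    """Stack the current and previous `lags - 1` samples of each channel."""
    n, c = x.shape
    feats = np.zeros((n - lags + 1, c * lags))
    for i in range(lags):
        feats[:, i * c:(i + 1) * c] = x[lags - 1 - i: n - i]
    return feats

X = lagged_features(eeg, n_lags)
y = X @ true_weights + 0.1 * rng.standard_normal(X.shape[0])

# Fit decoder weights by least squares on the first half; test on the rest.
half = X.shape[0] // 2
w, *_ = np.linalg.lstsq(X[:half], y[:half], rcond=None)
pred = X[half:] @ w

# Decoding quality is typically reported as the correlation between the
# predicted and actual trajectories.
r = np.corrcoef(pred, y[half:])[0, 1]
print(f"decoding correlation: {r:.3f}")
```

On real EEG the relationship is far noisier and the reported correlations are correspondingly lower, but the same fit-then-predict structure applies.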
The team plans to combine this EEG information with real-time data about blood-oxygen levels in the user’s frontal lobe using functional near-infrared (fNIR) technology developed by Shewokis at Drexel.
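One simple way to combine a fast-sampled EEG stream with a slower fNIR blood-oxygen stream is to upsample the fNIR measurements to the EEG rate and concatenate the two per-sample feature vectors before decoding. The snippet below is a minimal sketch of that idea; the sampling rates, channel counts and hold-and-repeat upsampling are illustrative assumptions, not the team's actual pipeline.

```python
import numpy as np

# Assumed (illustrative) rates: EEG sampled 10x faster than fNIR.
eeg_rate, fnir_rate = 100, 10          # Hz
n_seconds, eeg_ch, fnir_ch = 5, 8, 4

rng = np.random.default_rng(1)
eeg = rng.standard_normal((n_seconds * eeg_rate, eeg_ch))
fnir = rng.standard_normal((n_seconds * fnir_rate, fnir_ch))

# Hold each fNIR sample constant across the EEG samples it spans.
fnir_up = np.repeat(fnir, eeg_rate // fnir_rate, axis=0)

# One fused feature vector per EEG sample, ready for a single decoder.
fused = np.hstack([eeg, fnir_up])
print(fused.shape)  # (500, 12)
```

A single decoder trained on the fused vectors can then draw on both the fast electrical signals and the slower hemodynamic signals.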
Shewokis said, “We want to provide intuitive control over contact tasks, and we’re also interested in strengthening the motor imagery the patients are using as they think about what they want the arm to do. Ideally, this tactile, or haptic, feedback will improve the signal from the EEG and fNIR decoder and make it easier for patients to get their prosthetic arms to do exactly what they want them to do. We are moving toward incorporating the ‘brain-in-the loop’ for prosthetic use and control.”
O’Malley said the new technology is a big leap over what’s used in existing prosthetic devices, which don’t allow amputees to feel what they touch. Some state-of-the-art prostheses today use force-feedback systems that vibrate — much like the ‘vibrate’ mode on a mobile phone — to provide limited information about objects a prosthetic hand is gripping.
“Often, these vibrotactile cues aren’t very helpful,” O’Malley said. “Many times, individuals simply rely on visual feedback — watching their prosthesis grasp an object — to infer whether the object is soft or hard, how tightly they are grasping it and the like. There’s a lot of room for improvement.”
Gillespie said, “This truly unique team has been given the opportunity to help solve the challenging problem of brain-to-machine interface. I’m excited about our breakthroughs and the promise of future results. We are approaching the challenge with great respect for the brain-body connection and hope to discover methods to harness the body in new ways.
“Sensory feedback, especially haptic feedback, is often overlooked, but we think it’s the key to closing the loop between the brain and motorized prosthetic devices,” he said. “These results indicate that we stand a very good chance to help amputees and also help others who may be suffering from motor impairments.”
Glinert is hopeful for even broader impacts: “This research will revolutionize the control and interface of upper limb prosthetics. The work will lead to a better understanding of the role of sensory feedback in brain-computer interfaces and will lay the foundation for restoration of motor and sensory function for amputees and individuals with neurological disease.”
View a Rice University video of the prosthetic arm at www.youtube.com/watch?v=z1oIsXqc0U4