Monkeys, using only brain and no brawn, controlled a virtual hand they could also feel, in a demonstration that brings researchers closer to melding brain and machine. The research paves the way toward artificial limbs that convey the sensations of a real limb.
Researchers at the Duke University Center for Neuroengineering trained two monkeys to use brain activity to move a virtual hand in research published Wednesday in the journal Nature.
Other researchers have created "brain-machine interfaces" in which brain activity controls machines such as images on computer screens or prosthetic arms, but this study gave the monkeys an edge: they could feel what they were virtually touching.
"It has been recognized for a long time that a major weakness of brain-machine interface systems was that they are unidirectional, from brain to controller," Paul Zehr, author of Inventing Iron Man: The Possibility of a Human Machine and director of the Centre for Biomedical Research at the University of Victoria in Victoria, Canada, wrote in an email. Zehr was not involved in the study.
"This study makes a critical advance in actually providing patterns of electrical stimulation to the brain that mimicked sensory inputs in movement. This is a huge step forward for this field," Zehr wrote.
In the experiment, the Duke scientists wired two rhesus monkeys' brains and gave them the task of floating a virtual arm over a choice of three circles, one being the intended target.
The three circles looked identical, but the researchers made sure they could be told apart: waving the virtual hand over each circle produced a different artificial sensation, delivered by electrical stimulation of the brain.
Holding the virtual hand over the correct target for at least an eighth of a second earned a fruit-juice reward; hovering over an incorrect circle restarted the trial without a reward.
"Someday in the near future, quadriplegic patients will take advantage of this technology not only to move their arms and hands and to walk again, but also to sense the texture of objects placed in their hands, or experience the nuances of the terrain on which they stroll with the help of a wearable robotic exoskeleton," Miguel Nicolelis, professor of neurobiology at Duke University Medical Center, co-director of the Duke Center for Neuroengineering, and senior author of the study, said in a statement.
One monkey took four attempts and the other nine before learning to select the correct object on each trial. Several tests demonstrated that the monkeys were actually sensing the objects, not selecting them at random.
"The remarkable success with non-human primates is what makes us believe that humans could accomplish the same task much more easily in the near future," Nicolelis said.