In an experiment, a woman with quadriplegia used only her thoughts to shape the almost-human hand of a robot arm, directing it to pick up big and small boxes, a ball, an oddly shaped rock, and fat and skinny tubes. The demonstration shows that brain-computer interface technology has the potential to improve the function and quality of life of people unable to use their own arms.
The findings by researchers at the University of Pittsburgh School of Medicine, published (open-access) online Tuesday in the Journal of Neural Engineering, describe for the first time 10-dimensional brain control of a prosthetic device, in which the trial participant used the arm and hand to reach, grasp, and place a variety of objects.
“Our project has shown that we can interpret signals from neurons with a simple computer algorithm to generate sophisticated, fluid movements that allow the user to interact with the environment,” said senior investigator Jennifer Collinger, Ph.D., assistant professor, Department of Physical Medicine and Rehabilitation (PM&R), Pitt School of Medicine, and research scientist for the VA Pittsburgh Healthcare System.
In February 2012, small electrode grids with 96 tiny contact points each were surgically implanted in the regions of trial participant Jan Scheuermann’s left motor cortex that would normally control her right arm and hand movement.
Each electrode point picked up signals from an individual neuron, which were then relayed to a computer to identify the firing patterns associated with particular observed or imagined movements, such as raising or lowering the arm, or turning the wrist.
That “mind-reading” was used to direct the movements of a prosthetic arm developed by Johns Hopkins Applied Physics Laboratory.
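The article does not specify the lab's actual decoding algorithm, but the idea it describes, mapping the firing rates of ~96 recorded neurons onto intended movement dimensions, can be sketched as a simple linear decoder fit on calibration data. Everything below (the simulated neurons, weights, and least-squares fit) is an illustrative assumption, not the researchers' method:

```python
import numpy as np

# Hypothetical sketch of linear neural decoding: firing rates from 96
# recorded units are mapped to a 10-dimensional movement command, echoing
# the 96-contact grids and 10-dimension control described in the article.
rng = np.random.default_rng(0)

n_neurons = 96    # one unit per electrode contact (assumption for illustration)
n_dims = 10       # the 10 control dimensions reported in the study
n_samples = 500   # calibration trials of observed/imagined movement

# Simulated calibration data: each neuron's firing rate is a noisy linear
# function of the intended movement vector (a common simplifying model).
true_weights = rng.normal(size=(n_neurons, n_dims))
intended = rng.normal(size=(n_samples, n_dims))
rates = intended @ true_weights.T + 0.1 * rng.normal(size=(n_samples, n_neurons))

# Fit the decoder: least-squares map from firing rates back to movement.
decoder, *_ = np.linalg.lstsq(rates, intended, rcond=None)

# Decode a new (noiseless, for clarity) firing-rate observation into a
# 10-D movement command; it should closely match the simulated intent.
new_intent = rng.normal(size=n_dims)
new_rates = true_weights @ new_intent
decoded = new_rates @ decoder
```

In a real system the decoder would be recalibrated regularly and the decoded vector would drive the prosthetic arm's joint and grasp controllers in a closed loop, with the user watching the arm and adjusting their imagined movements.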
The full article was published on Kurzweil.net.