A new research paper has integrated robotics with neuroscience to decode brain activity for controlling an exoskeleton, and to send sensory information back to the brain in real time.
The paper was authored by Jun Morimoto, head of the Department of Brain Robot Interface at ATR Computational Neuroscience Laboratories, and Mitsuo Kawato, director of the ATR Brain Information Communication Research Laboratory Group.
The researchers integrated computational neuroscience, brain-motivated robotics and brain-machine interface (BMI) techniques to devise real-time brain-to-exoskeleton information transmission.
“A human volunteer wears a whole-body exoskeleton robot. Her/his brain activity is measured and decoding is carried out in real time. The decoded brain information is used to influence the robot control algorithms to realise brain-to-robot information transmission,” the researchers explained in their paper Creating the brain and interacting with the brain: an integrated approach to understanding the brain.
“Because the robot is attached to a human body, robot motion generates multi-modal sensory feedback to the brain by implementing robot-to-brain information transmission.”
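The loop the researchers describe — measure brain activity, decode it in real time, drive the exoskeleton, and return sensory feedback — can be illustrated with a toy sketch. Everything here is hypothetical: the function names (`read_brain_signal`, `decode_intent`, `exoskeleton_step`), the two-feature signal, and the linear decoder are illustrative stand-ins, not the researchers' actual algorithms.

```python
# Illustrative closed-loop BMI sketch. All names and signals are
# hypothetical stand-ins, not the system described in the paper.

def read_brain_signal(t):
    """Stand-in for a real-time brain-activity measurement."""
    # Toy signal: alternates between two feature patterns over time.
    return [1.0, 0.0] if t % 2 == 0 else [0.0, 1.0]

def decode_intent(signal):
    """Toy linear decoder mapping a feature vector to a motor intent."""
    weights = {"flex": [1.0, 0.0], "extend": [0.0, 1.0]}
    scores = {k: sum(w * s for w, s in zip(v, signal))
              for k, v in weights.items()}
    return max(scores, key=scores.get)

def exoskeleton_step(intent, angle):
    """Move one joint and return the new angle plus sensory feedback."""
    angle += 5.0 if intent == "flex" else -5.0
    # Robot-to-brain direction: the wearer feels the joint move,
    # modeled here as a proprioceptive feedback signal.
    feedback = {"joint_angle": angle}
    return angle, feedback

log = []
angle = 0.0
for t in range(4):
    signal = read_brain_signal(t)                      # measure
    intent = decode_intent(signal)                     # decode in real time
    angle, feedback = exoskeleton_step(intent, angle)  # act, feed back
    log.append((t, intent, feedback["joint_angle"]))

print(log)
```

In a real system each stage would run on streamed neural data under hard latency constraints; this sketch only shows how the two transmission directions, brain-to-robot and robot-to-brain, close into a single loop.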
The benefits of this kind of technology lie mostly in healthcare, where patients with motor disorders or injuries could use it to retrain themselves to carry out specific physical functions.
“In recent years, it has been found that using brain activity to control a robotic assistive system is also useful to help stroke patients recover their motor functions.”
The rest of this article was published on Techworld.com.au.