Autonomous brain-computer interface to enable robotic prosthetics
26 June 2019
Engineers at the University of Houston in the US have found that a brain-computer interface (BCI) could enable robotic prosthetics that perform more naturally.
A form of artificial intelligence, the BCI analyses the interactions between single-neuron activities and the information flowing to these neurons to identify when its user is expecting a reward.
Researchers believe that these findings could enable the creation of an autonomous BCI, which would learn about its subject and improve independently, without the need for further programming. In robotic prosthetics, such technology could sense the user’s intentions and carry out the task accordingly.
University of Houston biomedical engineering professor Joe Francis said: “This will help prosthetics work the way the user wants them to. The BCI quickly interprets what you’re going to do and what you expect as far as whether the outcome will be good or bad.”
Francis added that the team’s findings increase the ability to predict reward outcome from around 70% to 97%. The researchers used implanted electrodes to study brainwaves and changes in brain activity while tasks were performed. This was intended to provide insights into the modulation of interactions by conditioned reward expectations.
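The article does not say how reward expectation is actually decoded, but the general idea of classifying a reward condition from recorded neural activity can be sketched in a highly simplified form. Everything below is an illustrative assumption: the synthetic firing rates, the neuron count, and the nearest-centroid rule stand in for whatever method the researchers used.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical synthetic data: firing-rate vectors (spikes/s) for 20 neurons,
# recorded in "reward expected" vs "no reward expected" trials.
# These numbers are illustrative only, not from the study.
n_neurons, n_trials = 20, 100
reward_mean = rng.uniform(5, 15, n_neurons)
no_reward_mean = reward_mean + rng.uniform(-4, 4, n_neurons)

reward_trials = rng.normal(reward_mean, 1.0, (n_trials, n_neurons))
no_reward_trials = rng.normal(no_reward_mean, 1.0, (n_trials, n_neurons))

# Nearest-centroid classifier: label a new trial by whichever condition's
# mean firing-rate pattern it is closer to (Euclidean distance).
centroids = {
    "reward": reward_trials.mean(axis=0),
    "no_reward": no_reward_trials.mean(axis=0),
}

def classify(trial):
    return min(centroids, key=lambda k: np.linalg.norm(trial - centroids[k]))

# Accuracy on held-out synthetic trials.
test_r = rng.normal(reward_mean, 1.0, (50, n_neurons))
test_n = rng.normal(no_reward_mean, 1.0, (50, n_neurons))
correct = sum(classify(t) == "reward" for t in test_r) + \
          sum(classify(t) == "no_reward" for t in test_n)
accuracy = correct / 100
print(f"accuracy: {accuracy:.2f}")
```

On cleanly separated synthetic data a rule this simple scores well; the study's reported jump from roughly 70% to 97% would instead reflect richer models applied to real, noisy recordings.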
Francis noted: “We assume intention is in there, and we decode that information by an algorithm and have it control either a computer cursor, for example, or a robotic arm.”
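Decoding neural activity into a control signal, as Francis describes, is often illustrated with a linear decoder. The sketch below is an assumption for illustration only: it invents a linear tuning model and fits a decoder by least squares to map firing rates to a 2-D cursor velocity; the article does not state which decoding algorithm the team used.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical setup: decode 2-D cursor velocity from the firing rates of
# 30 neurons. The linear tuning model and all numbers are illustrative.
n_neurons, n_samples = 30, 500
true_weights = rng.normal(0, 1, (n_neurons, 2))   # assumed neural tuning

velocity = rng.normal(0, 1, (n_samples, 2))        # training cursor velocities
rates = velocity @ true_weights.T + rng.normal(0, 0.1, (n_samples, n_neurons))

# Fit the decoder W by least squares: minimise ||rates @ W - velocity||^2.
W, *_ = np.linalg.lstsq(rates, velocity, rcond=None)

# Decode a new pattern of neural activity into a cursor command.
new_rates = np.array([1.0, -0.5]) @ true_weights.T
decoded_velocity = new_rates @ W   # expected to be close to (1.0, -0.5)
print(decoded_velocity)
```

An autonomously updating interface of the kind the article describes would additionally refit `W` over time, using internally decoded signals such as reward expectation rather than explicit retraining sessions.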
According to the researchers, the BCI was even able to identify the intention for a task involving no movement because the neural activity pattern was similar to that during movement. Francis further commented: “This examination of reward motivation in the primary motor cortex could be useful in developing an autonomously updating brain-machine interface.”
Published by Verdict Medical Devices on June 13, 2019