Scientists create robotic arm that can be moved using your imagination — all the details revealed
Researchers at the University of California, San Francisco, have enabled a paralysed man to control a robotic arm using signals from his brain, transmitted via a computer.
He was able to grasp, move, and release objects simply by imagining himself performing the actions. The device, known as a brain-computer interface (BCI), functioned successfully for a record seven months without requiring any adjustments.
Until now, such devices had only worked for a day or two.
This BCI relies on an artificial intelligence (AI) model that adapts to small changes in brain activity as a person repeatedly imagines a movement, gradually improving its accuracy.
“This blending of learning between humans and AI is the next phase for these brain-computer interfaces,” said Professor Karunesh Ganguly, a neurologist at UCSF Weill Institute for Neurosciences. “It’s what we need to achieve sophisticated, lifelike function.”
The study, funded by the US National Institutes of Health, was published on 6 March in the journal Cell.
One of the study participants, who lost the ability to move and speak following a stroke years ago, can now control the robotic arm by imagining specific movements.
The key breakthrough involved understanding how brain activity shifts from day to day when the participant repeatedly imagines making these movements.
Once the AI system was trained to account for these changes, it maintained performance for months at a time.
Professor Ganguly previously studied brain activity patterns in animals and observed that these patterns evolved as the animals learned new movements.
He suspected the same process was occurring in humans, which explained why earlier BCIs quickly lost their ability to interpret brain signals.
Ganguly and Dr. Nikhilesh Natraj, a neurology researcher, worked with a participant who had been paralysed by a stroke and could neither move nor speak.
The participant had tiny sensors implanted on the surface of his brain to detect neural activity when he imagined moving.
To investigate whether these brain patterns changed over time, the participant was asked to imagine moving different body parts, such as his hands, feet, and head.
While he could not physically move, his brain continued to generate signals corresponding to these imagined movements.
The BCI recorded these signals and found that while the general patterns remained the same, their precise locations in the brain shifted slightly each day.
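The article doesn't describe the team's actual model, but the idea of correcting for day-to-day drift while the underlying patterns stay stable can be illustrated with a common technique from the BCI literature: realigning each day's neural features to a reference day with an orthogonal Procrustes fit, so a decoder trained once keeps working. The following NumPy sketch is a hypothetical toy example, not the study's method; the feature sizes, the rotation used to simulate drift, and the nearest-template decoder are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n_features, n_targets = 16, 4

# Reference day: one neural feature pattern per imagined movement.
reference = rng.normal(size=(n_targets, n_features))

def decode(pattern, templates):
    # Classify a pattern by cosine similarity to each movement's template.
    sims = templates @ pattern / (
        np.linalg.norm(templates, axis=1) * np.linalg.norm(pattern) + 1e-12
    )
    return int(np.argmax(sims))

# A later day: the same patterns, rotated in feature space to mimic drift.
Q, _ = np.linalg.qr(rng.normal(size=(n_features, n_features)))
drifted = reference @ Q

# Brief "tune-up": orthogonal Procrustes alignment from a small calibration
# set finds the rotation that best maps today's features back onto the
# reference space (SVD of the cross-covariance matrix).
U, _, Vt = np.linalg.svd(drifted.T @ reference)
R = U @ Vt
realigned = drifted @ R

# The unchanged decoder recognises every imagined movement again.
for target in range(n_targets):
    assert decode(realigned[target], reference) == target
```

In this toy setup the drift is a pure rotation, so a quick recalibration on a handful of known movements restores the original decoder exactly; real neural drift is messier, which is presumably why the study's AI model learns the changes over repeated practice rather than from a single snapshot.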
The researchers then asked the participant to imagine simple finger, hand, and thumb movements over two weeks while the AI system learned to interpret his brain activity. Initially, the robotic arm’s movements were imprecise.
To improve accuracy, the participant practised using a virtual robotic arm that provided feedback on how closely his imagined movements matched the intended actions.
Eventually, he was able to get the virtual arm to perform the desired tasks. Once the participant began practising with the real robotic arm, it only took a few practice sessions for him to transfer his skills to the real world. He was able to use the robotic arm to pick up blocks, turn them, and move them to new locations.
He was even able to open a cabinet, retrieve a cup, and hold it under a water dispenser. Months later, he could still control the robotic arm after a brief 15-minute “tune-up” to adjust for changes in his brain activity over time.
Ganguly and his team are now working to refine the AI model to make the robotic arm move faster and more smoothly. They also plan to test the system in a home environment. For people with paralysis, the ability to perform simple tasks like feeding themselves or getting a drink of water could be life-changing.
“I am very confident that we have learned how to build the system now, and that we can make this work,” Ganguly said.