
Researchers at Carnegie Mellon University have developed a robotic hand that can bend and move its fingers using nothing but human thought — no surgery, no implants required.
The breakthrough, achieved through noninvasive brain-computer interface (BCI) technology, marks a significant step forward in assistive robotics. Using electroencephalography (EEG) to record brain activity from outside the skull, participants controlled the robotic fingers in real time simply by imagining specific movements.
Unlike invasive BCI systems, which require surgically implanted electrodes to achieve precise control, this approach is entirely external, making it safer and more accessible. The system uses a deep-learning algorithm to decode the user’s brain signals and translate them into fine-motor commands for individual fingers — a level of control not previously achieved with noninvasive methods.
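The decoding idea can be illustrated with a minimal sketch: extract features from a short window of multichannel EEG, then pass them through a small neural network that outputs a continuous command per finger. The channel counts, feature choices, and two-layer network below are illustrative assumptions for this sketch, not the architecture used in the study.

```python
import numpy as np

# Hypothetical decoder sketch. All sizes and the randomly initialized
# weights are placeholders standing in for a trained model.
rng = np.random.default_rng(0)

N_CHANNELS = 64   # EEG electrodes (assumed)
N_FEATURES = 8    # e.g. band-power features per channel (assumed)
N_FINGERS = 2     # fingers under continuous control (assumed)

W1 = rng.normal(0, 0.1, (N_CHANNELS * N_FEATURES, 32))
b1 = np.zeros(32)
W2 = rng.normal(0, 0.1, (32, N_FINGERS))
b2 = np.zeros(N_FINGERS)

def decode(eeg_features: np.ndarray) -> np.ndarray:
    """Map one window of EEG features to continuous finger positions in (0, 1)."""
    x = eeg_features.reshape(-1)                 # flatten (channels, features)
    h = np.tanh(x @ W1 + b1)                     # hidden layer
    return 1 / (1 + np.exp(-(h @ W2 + b2)))      # sigmoid -> normalized flexion

# One simulated window of EEG features -> one command per finger.
window = rng.normal(size=(N_CHANNELS, N_FEATURES))
cmd = decode(window)
print(cmd.shape)
```

Because the decoder emits a graded value for each finger on every window rather than a discrete class label, it supports the kind of continuous, finger-level control the article describes.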
“This is the first time we’ve seen continuous, finger-level control of a robotic hand using noninvasive brain signals,” said Professor Bin He, who led the study. “It opens the door to more natural, intuitive prosthetics and hands-free assistive technologies.”
The findings, published in Nature Communications, build on earlier work by He’s lab, which previously demonstrated EEG-controlled drones and robotic arms. The latest advancement moves the field closer to real-world applications, particularly for individuals with paralysis or limb loss.