Researchers Explore Half Movements to Improve Brain–Computer Interfaces
Scientists in Russia have introduced a novel way to control virtual or robotic hands by tapping into brain signals. The development, announced by the Russian Science Foundation and covered by multiple science outlets, adds a fresh perspective on how brain–computer interfaces can translate thought into action.
In recent years the field has grown rapidly as researchers pursue reliable methods for people to interact with computers and assistive devices. Brain–computer interfaces aim to let users guide a computer cursor or operate prosthetic limbs, whether in virtual environments or in the real world. The overarching goal is to give individuals with paralysis or limited mobility greater freedom to engage with their surroundings. A common method records brain activity with electroencephalography (EEG) while participants imagine performing a task, enabling researchers to link brain signals to specific actions.
The team at Moscow State University of Psychology and Education proposes a distinctive technique: instead of purely imagining a movement, participants perform what may be described as partial, or half, movements. A person begins a movement and then gradually reduces its amplitude until no visible motion remains, yet the brain continues to generate the signals associated with the intended action. In a study involving 23 young volunteers, the researchers compared three conditions: half-movements, imagined movements, and actual movements. Volunteers first practiced abducting a limb as a real movement, as a purely imagined movement, and as a half-movement, while their brain activity was recorded with EEG. A post-task survey captured their subjective experience of each condition to gauge familiarity and ease of use.
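The article does not describe the team's analysis pipeline. As a rough illustration of how EEG activity might be compared across the three conditions, the following Python sketch computes mu-band (8–13 Hz) power from hypothetical epoch arrays; the sampling rate, channel count, and condition names are assumptions, not details from the study.

```python
# Illustrative sketch only: compares mu-band (8-13 Hz) EEG power across three
# hypothetical conditions, each stored as an array of shape
# (n_trials, n_channels, n_samples) sampled at an assumed 250 Hz.
import numpy as np
from scipy.signal import welch

FS = 250           # assumed sampling rate (Hz)
MU_BAND = (8, 13)  # mu rhythm, typically suppressed over motor cortex during movement

def mu_band_power(epochs: np.ndarray) -> np.ndarray:
    """Mean mu-band power per trial, averaged over channels."""
    freqs, psd = welch(epochs, fs=FS, nperseg=FS, axis=-1)
    mask = (freqs >= MU_BAND[0]) & (freqs <= MU_BAND[1])
    return psd[..., mask].mean(axis=(-1, -2))  # -> shape (n_trials,)

# Placeholder data standing in for preprocessed epochs from each condition.
rng = np.random.default_rng(0)
conditions = {name: rng.normal(size=(30, 16, 2 * FS)) for name in
              ("real_movement", "half_movement", "imagined_movement")}

for name, epochs in conditions.items():
    power = mu_band_power(epochs)
    print(f"{name:>18}: mean mu power = {power.mean():.3f}")
```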
Results showed that, from the participant's point of view, the intention to produce a quasi-movement felt almost identical to the intention to execute a real movement. In EEG recordings, brain activity during half-movements was more pronounced than during pure imagination, which tended to produce weaker signal amplitudes. The researchers also noted that attempts to perform half-movements occasionally caused subtle, nearly imperceptible muscle tensing, which could confound the tests by introducing small amounts of motor activity that, in turn, alter the EEG patterns.
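Neither the statistical comparison nor the muscle-activity check is spelled out in the article. The sketch below shows, under assumed data formats and thresholds, how one might compare event-related desynchronization (ERD) between half-movements and imagined movements and flag trials with residual forearm EMG; all variable names and values are illustrative.

```python
# Hypothetical checks: (1) paired comparison of ERD between half-movements and
# imagined movements across 23 subjects, (2) flagging trials whose EMG RMS
# suggests residual muscle tensing. Data are simulated placeholders.
import numpy as np
from scipy.stats import ttest_rel

def erd_percent(task_power: np.ndarray, rest_power: np.ndarray) -> np.ndarray:
    """ERD as percent change of band power relative to a resting baseline."""
    return 100.0 * (task_power - rest_power) / rest_power

rng = np.random.default_rng(1)
rest = rng.uniform(1.0, 2.0, size=23)              # baseline mu power per subject
half = rest * rng.uniform(0.5, 0.8, size=23)       # stronger suppression (assumed)
imagined = rest * rng.uniform(0.7, 0.95, size=23)  # weaker suppression (assumed)

t, p = ttest_rel(erd_percent(half, rest), erd_percent(imagined, rest))
print(f"half vs imagined ERD: t = {t:.2f}, p = {p:.4f}")

# Simple EMG screening: reject trials whose RMS exceeds a resting-based threshold.
emg_trials = rng.normal(scale=5.0, size=(30, 1000))  # microvolts, hypothetical
rms = np.sqrt((emg_trials ** 2).mean(axis=-1))
rejected = rms > 1.5 * np.median(rms)
print(f"trials flagged for muscle activity: {rejected.sum()} of {len(rms)}")
```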
The team believes that harnessing half-movements could lead to a brain–computer interface that more accurately interprets user commands and is easier to learn. Such an interface would provide smoother control of prosthetic limbs and greater ease in daily tasks, potentially speeding up clinical applications and broadening the reach of assistive technology for people with mobility challenges.
Looking ahead, the researchers plan to refine the decoding algorithms that translate neural signals into control commands. They also aim to validate the approach with larger and more diverse participant groups to ensure reliability across different ages, levels of motor ability, and neurological backgrounds. The goal is to establish a robust framework for natural, dependable operation of neuroprosthetic devices, ultimately improving quality of life for individuals facing mobility limitations and strengthening the broader collaboration between humans and machines in rehabilitation and daily assistance.
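The decoding algorithms the team plans to refine are not detailed in the article. For context, a widely used baseline for decoding motor tasks from EEG is Common Spatial Patterns (CSP) followed by linear discriminant analysis; the sketch below shows such a pipeline with MNE-Python and scikit-learn on placeholder data, and should not be read as the researchers' actual method.

```python
# Baseline decoding sketch: CSP spatial filters feeding a linear discriminant
# classifier, evaluated with 5-fold cross-validation on simulated epochs.
import numpy as np
from mne.decoding import CSP
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import Pipeline

rng = np.random.default_rng(2)
X = rng.normal(size=(120, 16, 500))   # epochs x channels x samples (placeholder)
y = rng.integers(0, 2, size=120)      # e.g. half-movement vs rest (placeholder)

clf = Pipeline([
    ("csp", CSP(n_components=4, log=True)),  # spatial filters maximizing class variance
    ("lda", LinearDiscriminantAnalysis()),
])
scores = cross_val_score(clf, X, y, cv=5)
print(f"cross-validated accuracy: {scores.mean():.2f} +/- {scores.std():.2f}")
```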