Brain-Computer Interfaces Using EEG and c-VEPs: A Multiplayer Game Show Study


A team of researchers at the University of Valladolid in Spain has developed an EEG-based brain-computer interface that let 22 participants play a simple multiplayer video game. The study appeared in the journal Frontiers in Human Neuroscience. The system translated players’ intentions into in-game actions with high precision, averaging about 94 percent accuracy at a pace of roughly five seconds per move.

Brain-computer interfaces function by capturing and interpreting neural signals, typically through electrodes placed on the scalp. Those signals are transformed into commands that let people interact with computers, devices, or software using only their thoughts. This line of research holds promise for assisting people with movement limitations and for expanding how humans interact with digital environments.

One especially promising development in BCI technology is the use of code-modulated visual evoked potentials, or c-VEPs. In this approach, each selectable target flickers according to a distinct coded visual stimulus, and the characteristic neural response the stimulus evokes reveals which target the user is attending to, enabling faster and more reliable interpretation of user intent than some older methods. Researchers see c-VEPs as a key to boosting speed and accuracy while reducing calibration time for new users.
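The idea behind c-VEP decoding can be illustrated with a toy template-matching sketch. This is not the study's actual pipeline; all names and parameters here are assumptions for illustration. Each target flickers with a circularly shifted copy of one binary m-sequence, and the decoder picks the shift whose template best correlates with the measured brain response.

```python
import numpy as np

def m_sequence(length=63):
    """Length-63 maximal sequence (+/-1) from a 6-bit LFSR, taps x^6 + x^5 + 1."""
    state = [1, 1, 1, 1, 1, 1]
    out = []
    for _ in range(length):
        out.append(state[-1])
        feedback = state[5] ^ state[4]        # taps at positions 6 and 5
        state = [feedback] + state[:-1]
    return np.array(out) * 2 - 1

code = m_sequence()
n_targets = 4
lags = [i * (len(code) // n_targets) for i in range(n_targets)]
templates = [np.roll(code, lag) for lag in lags]  # one flicker code per target

def decode(response):
    """Return the target whose template is most correlated with the response."""
    return int(np.argmax([np.dot(response, t) for t in templates]))
```

An m-sequence is a natural choice here because its circular autocorrelation is sharply peaked (63 at zero lag, -1 at every other lag), so shifted templates are easy to tell apart even in noise. Real c-VEP systems correlate against each user's calibrated response to the code rather than the raw code itself.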

The study team set out to design, implement, and test a multiplayer video game controlled by c-VEP signals. The result was a game-show format built around classic grid play: players compete to connect four discs on a vertical board, and the first player to complete a line in any direction wins the round.

To collect brain signals, participants wore a compact EEG headset with eight electrodes at fixed scalp locations. A Bluetooth link connected the headset to the computer running the game, and per-user calibration took less than a minute. The rapid setup was essential for evaluating how easily new users could adopt the system in a short session with minimal training.

Across trials, the system showed strong alignment between intended actions and game moves. Accuracy ranged from about 91 to 95 percent across tasks, averaging near 94 percent. Participants performed about 11 actions per minute on average, or roughly five seconds per action. User feedback highlighted the interface’s simplicity, quick learning curve, and responsive control, contributing to a favorable overall experience and a willingness to continue testing the method [Source: Frontiers in Human Neuroscience].
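The reported pace is easy to sanity-check: eleven selections per minute works out to just under five and a half seconds each, consistent with the "roughly five seconds per action" figure.

```python
# Sanity check of the reported pace: ~11 brain-controlled selections per
# minute corresponds to roughly 5.5 seconds per selection.
actions_per_minute = 11
seconds_per_action = 60 / actions_per_minute
print(f"{seconds_per_action:.1f} s per action")
```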

Despite these encouraging results, the current configuration is not yet ready for broad deployment. A practical hurdle is the need for a conductive gel to optimize electrode contact with the scalp, which can complicate use outside controlled laboratory settings and may affect comfort for extended sessions. Researchers are actively exploring alternative electrode approaches and dry-contact solutions to simplify setup and enhance user experience for at-home or clinical use [Source: Frontiers in Human Neuroscience].

Earlier projects in this field have explored other pathways toward mind-controlled robotics, including helmet-based systems that allow users to steer robots or perform tasks through thought alone. While such demonstrations show the potential of BCI technology, real-world adoption continues to depend on improvements in signal reliability, user comfort, and practical wearability across diverse environments. The current work with c-VEPs contributes to that ongoing evolution by offering a high-accuracy, user-friendly approach that can be refined for broader applications in assistive technology and human-computer interaction [Source: Frontiers in Human Neuroscience].
