Scientists from the École Polytechnique Fédérale de Lausanne (EPFL) have demonstrated that auditory hallucinations can be induced with a robotic chair, without the use of drugs. The findings were published in a peer-reviewed article in Psychological Medicine.
In an earlier set of experiments, the researchers invited volunteers to sit in a chair and press a button. After a brief pause, a small finger-sized device gently touched their backs. Over repeated sessions, participants began to feel as if someone were standing behind them. The researchers proposed that this sensation stemmed from a mismatch between the action of pressing the button and the sensory feedback of the touch. The follow-up study used the same mechanism to prompt auditory experiences rather than purely tactile ones.
The newer work followed a similar protocol: 48 volunteers sat in the chair and pressed a button. The back stimulation always followed the button press, but the key variable was the delay between action and touch. To test whether auditory phenomena could also be elicited, participants wore headphones that delivered soft white noise; some audio stimuli included recordings of the volunteers' own voices or of other people's voices. Each participant was asked to report whether they perceived a sound coming from the headphones. The results showed that when the sense of a presence behind them emerged, some participants also heard voices that were not externally present. The brain tended to generate a sound that cohered with the imagined "ghost" behind them. Interestingly, in the subset of cases where the back was touched immediately after the button press, voices were perceived only if a recording of the person's own voice was played through the headphones [EPFL researchers, Psychological Medicine].
The study offers insight into how multisensory integration and the sense of agency contribute to perceptual experience. When the timing between action and sensation is altered, the brain updates its predictions about the source of incoming sounds, and internally generated voices can emerge that feel real to the listener. This line of investigation underscores the brain's capacity to construct auditory experiences from subtle cues and mismatches in sensory feedback, highlighting the powerful role of expectation and context in perception.
These experiments align with broader lines of inquiry into how somatosensory signals and auditory processing interact to shape conscious experience. By manipulating the sequence and timing of tactile input and auditory stimuli, researchers can probe the boundaries between perception, self-awareness, and the sense of location in space. The implications extend to understanding certain clinical phenomena where people report hearing voices or feel an external presence, offering a framework to study how perception can be swayed by the brain’s internal predictions rather than external inputs. Ongoing work in this area continues to explore how different sensory modalities influence each other and how individual differences in processing might modulate these experiences.
In related developments, scientists are examining how such multisensory mismatches might inform approaches to dementia risk assessment and cognitive health monitoring. By analyzing how people perceive and integrate combined tactile and auditory cues, researchers hope to identify patterns associated with cognitive decline and to develop noninvasive methods for early detection. The work also invites ethical discussion about the boundaries of perceptual manipulation in research, ensuring participants are informed and protected while exploring the mysteries of human perception [EPFL research notes, Psychological Medicine].