Varano E, Thornton M, Kolossa D, Zeiler S, Reichenbach T (2026). Delta-band cortical speech tracking predicts audiovisual speech-in-noise benefit from natural and simplified visual cues.
Publication Type: Journal article
Publication year: 2026
Volume: 325
Article Number: 121654
DOI: 10.1016/j.neuroimage.2025.121654
Humans comprehend speech in noisy environments more effectively when they can see the talker’s facial movements. While the benefits of audiovisual (AV) speech are well established, the specific visual features that support this enhancement and its underlying neural mechanisms remain unclear. Here, we examine how simplified facial signals that preserve structural and dynamic information affect AV speech-in-noise comprehension as well as neural speech tracking. In a behavioural experiment, participants viewed natural or progressively simplified facial videos while listening to short sentences in background noise. Visual stimuli included natural facial recordings, coarse facial outlines, and a simple geometric analogue of visual speech—a disk whose radius oscillated with the speech envelope. In an EEG experiment, we assessed how the progressively simplified visual signals influenced cortical tracking of the speech envelope during continuous AV speech. Behaviourally, we found that comprehension improved with increasing visual detail, while the disk provided no AV benefit, underscoring the importance of dynamic facial cues. In the EEG experiment, only the most natural visual signals enhanced delta-band (1–4 Hz) temporal response functions (TRFs) relative to audio-only stimulation, peaking around 180 ms. This neural enhancement correlated with behavioural benefit across participants. Theta-band effects were weaker and less consistent, suggesting a more limited role in AV integration. Together, these findings highlight the importance of facial detail in AV speech perception, with natural visual input driving stronger delta-band tracking and potentially reflecting alignment of auditory processing with word-level visual cues.
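The analysis described above — tracking the speech envelope in delta-band EEG via temporal response functions — follows a widely used pipeline. The sketch below is not the authors' code; it is a generic illustration, assuming a Hilbert-transform envelope, a zero-phase Butterworth band-pass for the 1–4 Hz delta band, and a ridge-regression TRF. All function names and parameter values are illustrative.

```python
import numpy as np
from scipy.signal import hilbert, butter, filtfilt

def speech_envelope(audio, fs, band=(1.0, 4.0)):
    """Broadband amplitude envelope of an audio signal, band-passed to the delta range."""
    env = np.abs(hilbert(audio))  # analytic-signal magnitude = amplitude envelope
    # 2nd-order Butterworth band-pass (1-4 Hz), applied forward and backward
    b, a = butter(2, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    return filtfilt(b, a, env)    # zero-phase filtering, so no group delay is added

def estimate_trf(stimulus, response, fs, tmin=-0.1, tmax=0.4, alpha=1.0):
    """Ridge-regression TRF: weights mapping the lagged stimulus onto the response."""
    lags = np.arange(int(round(tmin * fs)), int(round(tmax * fs)) + 1)
    # Lagged design matrix; np.roll wraps at the edges, acceptable for a sketch.
    X = np.stack([np.roll(stimulus, lag) for lag in lags], axis=1)
    w = np.linalg.solve(X.T @ X + alpha * np.eye(len(lags)), X.T @ response)
    return lags / fs, w           # lag times in seconds and the TRF weights
```

In this framing, a TRF weight peaking near a lag of 0.18 s in the delta-band model would correspond to the ~180 ms enhancement reported in the abstract.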
APA:
Varano, E., Thornton, M., Kolossa, D., Zeiler, S., & Reichenbach, T. (2026). Delta-band cortical speech tracking predicts audiovisual speech-in-noise benefit from natural and simplified visual cues. NeuroImage, 325, Article 121654. https://doi.org/10.1016/j.neuroimage.2025.121654
MLA:
Varano, Enrico, et al. "Delta-band cortical speech tracking predicts audiovisual speech-in-noise benefit from natural and simplified visual cues." NeuroImage, vol. 325, 2026, article 121654, doi:10.1016/j.neuroimage.2025.121654.
BibTeX:
@article{varano2026deltaband,
  author  = {Varano, Enrico and Thornton, M. and Kolossa, D. and Zeiler, S. and Reichenbach, T.},
  title   = {Delta-band cortical speech tracking predicts audiovisual speech-in-noise benefit from natural and simplified visual cues},
  journal = {NeuroImage},
  volume  = {325},
  pages   = {121654},
  year    = {2026},
  doi     = {10.1016/j.neuroimage.2025.121654}
}