Cortical tracking of formant modulations derived from silently presented lip movements and its decline with age

Nina Suess, Anne Hauswald, Patrick Reisinger, Sebastian Rösch, Anne Keitel, Nathan Weisz

Research output: Contribution to journal › Article › peer-review

5 Citations (Scopus)
87 Downloads (Pure)


The integration of visual and auditory cues is crucial for successful processing of speech, especially under adverse conditions. Recent reports have shown that when participants watch muted videos of speakers, the phonological information about the acoustic speech envelope, which is associated with but independent of the speakers' lip movements, is tracked by the visual cortex. However, the speech signal also carries richer acoustic details, for example, about the fundamental frequency and the resonant frequencies, whose visuo-phonological transformation could aid speech processing. Here, we investigated the neural basis of the visuo-phonological transformation processes of these more fine-grained acoustic details and assessed how they change as a function of age. We recorded whole-head magnetoencephalographic (MEG) data while the participants watched silent normal (i.e., natural) and reversed videos of a speaker and paid attention to their lip movements. We found that the visual cortex is able to track the unheard natural modulations of resonant frequencies (or formants) and the pitch (or fundamental frequency) linked to lip movements. Importantly, only the processing of natural unheard formants decreases significantly with age in the visual and also in the cingulate cortex. This is not the case for the processing of the unheard speech envelope, the fundamental frequency, or the purely visual information carried by lip movements. These results show that unheard spectral fine details (along with the unheard acoustic envelope) are transformed from a mere visual to a phonological representation. Aging especially affects the ability to derive spectral dynamics at formant frequencies. As listening in noisy environments should capitalize on the ability to track spectral fine details, our results provide a novel focus on compensatory processes in such challenging situations.

Original language: English
Pages (from-to): 4818-4833
Number of pages: 16
Journal: Cerebral Cortex
Issue number: 21
Early online date: 22 Jan 2022
Publication status: Published - 1 Nov 2022


Keywords

  • low-frequency speech tracking
  • MEG
  • multisensory processing
  • visual speech processing

ASJC Scopus subject areas

  • Cellular and Molecular Neuroscience
  • Cognitive Neuroscience


