Poster E111, Monday, March 27, 2:30 – 4:30 pm, Pacific Concourse
Exploring the synchronization features of the sensorimotor integration of speech
M Florencia Assaneo1, David Poeppel1,2; 1New York University, Psychology Department, 2Max Planck Institute
Despite the long history of exploring the causes and consequences of the sensorimotor integration of speech, few experiments have tested in a principled manner how the link between motor and auditory areas may be mechanistically achieved. Here, we identify a basic architectural constraint of the auditory-motor circuitry for speech by exploring the synchronization properties of this system. An MEG protocol was developed to investigate the synchronization between the phases of the slow oscillations (2-7 Hz) in motor and auditory cortices. The experiment consisted of two main blocks. In the first, a functional source localization protocol was employed to identify each subject's speech-motor and auditory cortical regions. In the second, subjects were instructed to listen passively to a set of audio trials composed of strings of syllables synthesized at fixed rates. The MEG signals originating in the previously localized areas were extracted and evaluated for synchronization. The results showed that the coupling between motor and auditory brain activity increases as the heard syllable rate approaches 4.5 Hz. Interestingly, the mean syllable rate across languages occurs at about the same frequency. Finally, numerical simulations revealed that a simple neural model (a standard Wilson-Cowan model for the motor cortex driven by auditory input) replicates the synchronization features between motor and auditory cortices.
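A minimal sketch of the kind of simulation described above: a standard Wilson-Cowan excitatory-inhibitory pair driven by a sinusoidal "auditory" input at a given syllable rate, with phase coupling quantified by a phase-locking value. The parameter values, the sinusoidal form of the drive, and the phase-locking measure are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def sigmoid(x, a=1.0, theta=4.0):
    # Sigmoidal response function used in the standard Wilson-Cowan model
    return 1.0 / (1.0 + np.exp(-a * (x - theta)))

def simulate_wilson_cowan(drive_freq, t_max=10.0, dt=1e-3,
                          c_ee=16.0, c_ei=12.0, c_ie=15.0, c_ii=3.0,
                          tau=0.066, drive_amp=1.5):
    """Euler integration of an E/I Wilson-Cowan pair ("motor cortex"),
    with the excitatory population driven by a sinusoidal "auditory"
    input at drive_freq Hz. Parameter values are illustrative only."""
    n = int(t_max / dt)
    t = np.arange(n) * dt
    E = np.zeros(n)  # excitatory population activity
    I = np.zeros(n)  # inhibitory population activity
    drive = drive_amp * (1.0 + np.sin(2.0 * np.pi * drive_freq * t))
    for k in range(n - 1):
        dE = (-E[k] + sigmoid(c_ee * E[k] - c_ei * I[k] + drive[k])) / tau
        dI = (-I[k] + sigmoid(c_ie * E[k] - c_ii * I[k])) / tau
        E[k + 1] = E[k] + dt * dE
        I[k + 1] = I[k] + dt * dI
    return t, E, drive

def analytic_phase(x):
    # Instantaneous phase via an FFT-based Hilbert transform
    n = len(x)
    X = np.fft.fft(x - x.mean())
    h = np.zeros(n)
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        h[1:n // 2] = 2.0
    else:
        h[1:(n + 1) // 2] = 2.0
    return np.angle(np.fft.ifft(X * h))

def phase_locking_value(x, y):
    # PLV in [0, 1]: 1 = perfectly locked phases, 0 = no consistent relation
    dphi = analytic_phase(x) - analytic_phase(y)
    return np.abs(np.mean(np.exp(1j * dphi)))

# Sweep syllable rates and measure drive-to-model phase coupling
for rate in (2.5, 4.5, 6.5):
    t, E, drive = simulate_wilson_cowan(rate)
    print(f"{rate} Hz: PLV = {phase_locking_value(E, drive):.3f}")
```

Because the Euler update is a convex combination of the previous state and a sigmoid bounded in (0, 1), the population activities remain in [0, 1]; the sweep illustrates how coupling between the driven model and its input can be compared across syllable rates, analogous to the MEG analysis.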
Topic Area: PERCEPTION & ACTION: Multisensory