Poster Session C, Sunday, March 24, 5:00 – 7:00 pm, Pacific Concourse
Neural dynamics of human auditory perception across space and time
Matthew Lowe1, Yalda Mohsenzadeh1, Benjamin Lahner1, Santani Teng1,2, Ian Charest3, Aude Oliva1; 1Massachusetts Institute of Technology, 2Smith-Kettlewell Eye Research Institute, 3University of Birmingham
Neuroimaging methods such as functional magnetic resonance imaging (fMRI) and magnetoencephalography (MEG) have separately afforded insight into the spatial and the temporal mechanisms governing human audition. Yet we still lack fundamental knowledge of where and when, jointly, auditory representations emerge in the human brain. Comparing data across imaging modalities, however, can provide an integrated account of perception across space and time. Here we demonstrate how the processing of sounds from various sources unfolds across space and time in the human brain. Participants (n = 15) listened to 80 real-world sounds while we acquired MEG and fMRI data in independent sessions. First, applying multivariate pattern classification to the MEG data, we show that individual sound identity emerges rapidly, beginning within 80 ms of sound onset. Second, using representational similarity analysis, we correlate whole-brain MEG and fMRI data to reveal the temporal dynamics of sound processing in the human brain. Refining this analysis, we examine independently localized regions of interest in occipital, temporal, and frontal cortex to map their distinctive spatiotemporal dynamics. Our results reveal a hierarchical organization of sound processing that emerges with neuronal activation in primary auditory cortex and spreads rapidly across a distributed neural network within the first few hundred milliseconds of audition. Together, these findings elucidate the differential temporal dynamics of representations of individual sounds and provide an integrated account of auditory processing across space and time in the human brain.
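The MEG-fMRI fusion described above can be sketched in a few lines: build one representational dissimilarity matrix (RDM) per MEG timepoint, one RDM from fMRI patterns, and correlate them over time. The sketch below uses synthetic data; the array shapes, the correlation-distance RDMs, and the Spearman comparison are illustrative assumptions, not the authors' exact pipeline.

```python
# Minimal sketch of MEG-fMRI fusion via representational similarity
# analysis (RSA), on synthetic data. Shapes and distance/comparison
# metrics are assumptions for illustration only.
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr

rng = np.random.default_rng(0)
n_sounds, n_sensors, n_times, n_voxels = 80, 64, 120, 500

# Synthetic stand-ins for real recordings:
# MEG: sensor patterns per sound at each timepoint; fMRI: voxel patterns.
meg = rng.standard_normal((n_sounds, n_sensors, n_times))
fmri = rng.standard_normal((n_sounds, n_voxels))

# One RDM per MEG timepoint and a single fMRI RDM, each stored as the
# condensed upper triangle of a (1 - Pearson correlation) matrix.
fmri_rdm = pdist(fmri, metric="correlation")
meg_rdms = np.array([pdist(meg[:, :, t], metric="correlation")
                     for t in range(n_times)])

# Fusion: Spearman-correlate each MEG RDM with the fMRI RDM, yielding a
# time course of representational similarity for that region.
fusion = np.array([spearmanr(meg_rdms[t], fmri_rdm)[0]
                   for t in range(n_times)])

print(fusion.shape)  # one similarity value per MEG timepoint
```

With real data, repeating this per region of interest (e.g., auditory, occipital, frontal ROIs) yields the region-specific spatiotemporal time courses the abstract describes.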
Topic Area: PERCEPTION & ACTION: Audition