Poster B132, Sunday, March 26, 8:00 – 10:00 am, Pacific Concourse
Which way: Neural decoding of spatial directions in images, schemas, and words
Steven Weisberg1, Steven Marchette1, Anjan Chatterjee1; 1University of Pennsylvania
When we navigate, we often interpret spatial directions presented in different formats, for example by reading a map or a list of written directions. How do neural representations bridge these formats to reach a common understanding? Our ability to transition seamlessly between formats suggests that some brain regions may encode directions in a common code across formats. Research on prepositions and actions has revealed distinct regions involved in processing words, schemas, and images (Amorapanth et al., 2012; Watson et al., 2014). Here, we used multivoxel pattern analysis of fMRI data to test the hypothesis that distinct regions of the brain decode spatial direction in these formats. In a continuous carryover sequence (Aguirre, 2007), we presented spatial directions one at a time. Subjects (n = 20) determined the spatial direction on each trial but responded only on catch trials, which were discarded from analysis. Searchlight analyses revealed separate decoding of images and schemas, but not words, in the parahippocampal and occipital place areas (PPA, OPA), scene-selective regions of cortex (Epstein, 2014). To further characterize the coding of spatial directions, we tested a visual-angle model (dissimilarity proportional to the angular distance between directions) and an egocentric model (angular distance computed across the front, so that sharp left and sharp right are most dissimilar). Both models fit the pattern of activity in bilateral OPA, but the egocentric model fit better than the visual-angle model. Our results suggest that the OPA decodes visuospatial, but not verbal, representations of spatial directions, supporting our ability to interpret maps and signs.
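The distinction between the two models can be illustrated with a small sketch. This is not the authors' analysis code; the particular direction set (seven signed egocentric angles at 45-degree steps, 0 = straight ahead, negative = left, positive = right) and the exact distance definitions are illustrative assumptions. The visual-angle model takes the shortest arc between two directions, while the egocentric model measures distance across the front, so sharp left and sharp right come out maximally dissimilar:

```python
import numpy as np

# Hypothetical direction set: signed egocentric angles in degrees.
# 0 = straight ahead, negative = left, positive = right.
directions = np.array([-135, -90, -45, 0, 45, 90, 135])

def visual_angle_rdm(angles):
    """Dissimilarity = angular distance along the shortest arc (0-180 deg)."""
    d = np.abs(angles[:, None] - angles[None, :])
    return np.minimum(d, 360 - d)

def egocentric_rdm(angles):
    """Dissimilarity = angular distance measured across the front (through 0 deg),
    i.e., the unwrapped difference of signed angles, so sharp left (-135)
    and sharp right (+135) are maximally dissimilar."""
    return np.abs(angles[:, None] - angles[None, :])

# Sharp left vs. sharp right: the models disagree.
# Visual-angle model: 90 deg (shortest arc, behind the viewer).
# Egocentric model: 270 deg (path across the front).
print(visual_angle_rdm(directions)[0, -1], egocentric_rdm(directions)[0, -1])
```

In a representational similarity analysis, each model matrix would be correlated with the neural pattern-dissimilarity matrix from OPA; the reported result is that the egocentric matrix fits better.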
Topic Area: PERCEPTION & ACTION: Vision