Poster Session B, Sunday, March 24, 8:00 – 10:00 am, Pacific Concourse
Neural components of reading revealed by distributed and symbolic computational models
Ryan Staples1, William W. Graves1; 1Rutgers University - Newark
Explicit computational models could fill a critical need by constraining the interpretation of fMRI data. The domain of reading has two well-characterized computational models, both taking orthographic input and producing phonological output. The biologically inspired connectionist model learns statistical, distributed relationships between inputs and outputs. In contrast, the dual-route cascaded (DRC) model is symbolic, connecting orthography to phonology with preprogrammed rules. Both models are supported by behavioral data but have never been directly compared in the brain. Using representational similarity analysis, we compare the neural representations of orthography-phonology transforms generated by a connectionist model trained on either standard graphemic inputs or novel, graded orthographic inputs. We further provide the first comparison of connectionist and DRC models of reading in the brain. Correlations among all models were high (orthographic-graphemic, r=0.765; graphemic-DRC, r=0.537; orthographic-DRC, r=0.497). All model dissimilarity structures correlated with neural representations in a left-lateralized network spanning frontal, temporal, and occipital cortices, as well as a more limited right-lateralized network. No differences were found between the neural representations of the two connectionist models, or between the graphemic model and the DRC model. The connectionist model with graded orthographic inputs, however, corresponded more closely than the DRC model to activity in a subset of the areas described above. Our results show that a connectionist model fits the neural representation of words better than the DRC model when the inputs are orthographic. Overall, we have provided a computational approach to revealing the neural systems associated with specific cognitive components of reading.
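The core comparison above rests on representational similarity analysis: each model's pairwise dissimilarity structure over words is correlated with the dissimilarity structure of neural activity patterns. A minimal sketch of that logic, using random placeholder data in place of the actual model outputs and fMRI patterns (all array names and sizes here are illustrative assumptions, not the authors' pipeline):

```python
# Sketch of representational similarity analysis (RSA).
# model_a / model_b / neural are HYPOTHETICAL stand-ins for, e.g.,
# connectionist model outputs, DRC model outputs, and voxel patterns.
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr

rng = np.random.default_rng(0)

n_words = 20
model_a = rng.normal(size=(n_words, 50))   # rows = words, cols = output units
model_b = rng.normal(size=(n_words, 50))
neural  = rng.normal(size=(n_words, 100))  # rows = words, cols = voxels

def rdm(patterns):
    """Condensed representational dissimilarity matrix:
    correlation distance between every pair of word patterns."""
    return pdist(patterns, metric="correlation")

# Rank-correlate each model RDM with the neural RDM; rank correlation
# is common in RSA because RDMs need not be linearly related.
rho_a, _ = spearmanr(rdm(model_a), rdm(neural))
rho_b, _ = spearmanr(rdm(model_b), rdm(neural))
print(f"model A vs neural: rho={rho_a:.3f}")
print(f"model B vs neural: rho={rho_b:.3f}")
```

In the study itself, this comparison would be repeated over local neighborhoods of voxels (e.g., a searchlight) to map where each model's structure fits the neural data.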
Topic Area: LANGUAGE: Other