Poster Session E, Monday, March 25, 2:30 – 4:30 pm, Pacific Concourse
The Linguistic-Gestural Processing of Self-Adaptors, Emblems, and Iconic Gestures: An fMRI Study
Kawai Chui1, Kanyu Yeh1, Ting-Ting Chang1; 1National Chengchi University, Taiwan
The study investigated the neural network underlying the processing of symbolic and non-symbolic meanings across speech and gesture. Self-adaptors, emblems, and iconic gestures are well suited to form a continuum of semantic distinctions in relation to the accompanying speech. The linguistic-gestural stimuli were presented in a setting that closely resembled daily conversation, with a speaker and an addressee, a wide variety of clauses and gestures, and no specific task for the participants to perform. The results showed common brain regions for both symbolic and non-symbolic gestures, with stronger activations in the left and right fusiform gyri, and for speech in the bilateral superior temporal gyri. Differences were also found: iconic gestures elicited greater activation than emblems in the left and right fusiform gyri, the precuneus, the left superior parietal lobule, the right supramarginal gyrus, and the left anterior cingulate cortex, whereas self-adaptors elicited greater activation than emblems in the bilateral superior parietal lobules. Altogether, the findings showed that the brain is sensitive to meanings presented across modalities (Holle et al., 2008; Xu et al., 2009). The brain regions involved in speech-gesture integration varied across gesture types and cross-modal semantic relations. Gestures with symbolic meanings directly related to speech primarily recruited regions associated with visual processing and episodic memory, since the gestural meaning was determined relative to the accompanying speech, with gesture and speech depicting different aspects of the same event. By contrast, processing non-symbolic gestures placed higher demands on visual feature discrimination (Buccino et al., 2001).
Topic Area: LANGUAGE: Semantic