Poster C80, Sunday, March 26, 5:00 – 7:00 pm, Pacific Concourse
Gesture Comprehension and Verbal Working Memory
Seana Coulson, Ying Choon Wu, Megan Bardolph, Tania Delgado; University of California, San Diego
Behavioral research on the comprehension of co-speech iconic gestures has produced conflicting accounts of the role of verbal working memory (WM). Here we use scalp-recorded event-related brain potentials (ERPs) to examine this issue. Electroencephalogram (EEG) was recorded as 14 healthy adults engaged in multimodal discourse comprehension and a concurrent verbal memory task. Each trial began with participants hearing either 1 (low load) or 4 (high load) digits to be remembered. The discourse comprehension task involved watching a video of a speaker describing everyday objects while producing either congruent or incongruent co-speech gestures. Comprehension was tested by asking participants whether a photograph presented after each video was related or unrelated to the scene described by the speaker in the video. After responding to the photograph, participants recalled the digits they had heard at the beginning of the trial, in the order they had heard them. ERPs time-locked to the onset of the first word in each video revealed a larger P600 component 500-700ms post-onset to words accompanied by incongruent than congruent gestures. ERPs time-locked to the last word in each video revealed no effects of gesture congruity, and a slow-rising anterior negativity beginning 500ms post-stimulus that was larger for low- than high-load trials. Results point to independent effects of verbal memory load and speech-gesture congruity, consistent with models that posit a minimal role for verbal WM in gesture comprehension.
Topic Area: LANGUAGE: Semantic