Poster Session E, Monday, March 25, 2:30 – 4:30 pm, Pacific Concourse
Learning an artificial sign language: An ERP study
Tania Delgado1, Jared Gordon1, Seana Coulson1; 1University of California, San Diego
Recent work in cognitive science has addressed how communicative pressures shape the structure of the language system to increase its learnability. One claim that has emerged from this work is that cultural evolution makes languages more amenable to the neural processing architecture. Here we address this claim by recording EEG as participants learned an artificial semiotic system that used gestures to convey concepts from six different domains (cooking, photography, beauty salons, church, prison, and concert halls). Each domain included a person (e.g., chef), a location (restaurant), an object (frying pan), and an action (to cook). For half of the conceptual domains, the signals were improvised pantomimes; the signals for the other domains were drawn from an artificial sign language that had been subject to cultural evolution. These evolved signs, taken from a previous laboratory study of iterated language learning (Motamedi et al., 2016), included gestural markers indicating whether the referent was a person, location, object, or action. In the EEG study, participants watched videos of gestures that were either improvised or evolved. After each video, a word appeared, and participants pressed a key to indicate whether the word matched or mismatched the preceding sign. For both improvised and evolved signs, mismatches elicited a larger-amplitude N400 than matches (F(1,31)=34.4, p<.0005). However, the mismatch effect was more prominent for evolved signs (F(28,868)=1.6, p<.05). This study provides a clearer view of how cultural transmission adapts language to the learner's brain.
Topic Area: LANGUAGE: Semantic