Poster C58, Sunday, March 26, 5:00 – 7:00 pm, Pacific Concourse
Phonological and semantic priming in American Sign Language: An ERP study
Brittany Lee1,2, Katherine J. Midgley1, Phillip J. Holcomb1, Karen Emmorey1, Gabriela Meade1,2; 1San Diego State University, 2University of California, San Diego
Although previous ERP studies have demonstrated N400 semantic priming effects for sign language, almost nothing is known about the neural dynamics of form-based priming or how semantic and phonological priming effects interact in a visual-gestural language. Using ERPs, this study investigated the time course of phonological and semantic priming in American Sign Language (ASL). Deaf ASL signers viewed pairs of ASL signs (with a 1300 ms stimulus onset asynchrony) and judged their semantic relatedness. Half of the semantically unrelated pairs were phonologically related and half were not. Phonological relatedness was defined as sharing two out of three phonological units: location, handshape, and/or movement. Results showed effects of both semantic and phonological relatedness. Target signs in semantically related pairs elicited faster responses and smaller amplitude N400s than those in semantically unrelated pairs, mirroring the classic semantic priming effects reported for spoken word recognition. Target signs in phonologically related pairs also elicited smaller amplitude N400s than targets in phonologically unrelated pairs, paralleling the N400 priming effects observed for rhyming words in spoken language. However, phonological overlap interfered with semantic relatedness judgments, such that responses were slower for phonologically related pairs than for phonologically unrelated pairs. We hypothesize that similarity in form interfered with participants’ ability to reject a semantic relationship between the signs. Overall, the results indicate that lexical access follows a similar time course in signed and spoken languages and that spreading activation between form-related lexemes is modality independent.
Topic Area: LANGUAGE: Lexicon