
Poster D24

Examining MEG visual mismatch responses to American Sign Language by hearing signers and non-signers

Poster Session D - Monday, April 15, 2024, 8:00 – 10:00 am EDT, Sheraton Hall ABC

Qi Cheng1, Yuting Zhang1, Tzu-Han Zoe Cheng1, Tian Christina Zhao1; 1University of Washington

Auditory mismatch responses (MMR) are commonly used in spoken language research to examine the automatic detection of linguistic anomalies or changes at various linguistic levels. Examining the spatial and temporal characteristics of the visual MMR (vMMR) in sign languages can provide valuable insights into perceptual learning and its role in language development. Using magnetoencephalography (MEG), the current study established a vMMR paradigm in American Sign Language (ASL) to examine the role of sign language experience in linguistic visual processing among hearing signers and non-signers. We identified one pair of real lexical signs (BOY, handshape = flat-B, location = forehead; KID, handshape = horn, location = nose) and swapped the handshapes to create two non-signs (NS_1, handshape = horn, location = forehead; NS_2, handshape = flat-B, location = nose). We adopted an oddball paradigm in which deviants were interspersed among standards, occurring about 15% of the time. In each block, the standard and deviant constituted a lexical vs. non-lexical contrast created by changing the handshape but not the location (e.g., standard: BOY, deviant: NS_1). Participants were instructed to detect changes in a central cross while the signs were presented in the periphery, to ensure preattentive processing of the signs. So far, we have collected data from 9 hearing signers and 2 hearing non-signers; all hearing signers are proficient in ASL. Preliminary results suggest an increased vMMR between 150 and 250 ms in occipital and temporal regions for both lexical and non-lexical deviants in hearing signers, but not in non-signers. We aim to include 16 participants per group and to conduct ROI-based as well as exploratory whole-brain analyses once data collection is complete.
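The 15%-deviant oddball structure described above can be sketched in a few lines of Python. This is an illustrative reconstruction, not the authors' stimulus code: the function name, trial count, and minimum-spacing constraint (a common convention in mismatch paradigms, not stated in the abstract) are all assumptions.

```python
import random

def oddball_sequence(standard, deviant, n_trials=200, deviant_prop=0.15,
                     min_gap=2, seed=0):
    """Pseudo-randomly order stimuli so that deviants make up roughly
    `deviant_prop` of trials, with at least `min_gap` standards between
    consecutive deviants (an assumed, commonly used spacing constraint)."""
    rng = random.Random(seed)
    n_dev = round(n_trials * deviant_prop)
    # Sample deviant positions in a compressed index space, then re-expand
    # by adding the mandatory gaps back in; this guarantees the spacing
    # constraint without rejection sampling.
    compressed = sorted(rng.sample(range(n_trials - (n_dev - 1) * min_gap),
                                   n_dev))
    positions = {p + i * min_gap for i, p in enumerate(compressed)}
    return [deviant if i in positions else standard for i in range(n_trials)]

# Example block: BOY as standard, the non-sign NS_1 as deviant.
seq = oddball_sequence("BOY", "NS_1")
```

Because consecutive compressed indices differ by at least 1, the expanded deviant positions differ by at least `min_gap + 1` trials, so each deviant is always preceded by several standards.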

Topic Area: LANGUAGE: Other




April 13–16, 2024