Poster E86, Monday, March 26, 2:30-4:30 pm, Exhibit Hall C
Statistical learning of nonadjacent dependencies among different modalities
Yu-Huei Lian1, Kunyu Xu1, Denise H. Wu1; 1National Central University
Previous literature indicates that statistical learning (SL), the ability to detect regularities among adjacent elements, is a general mechanism for learning and for processing any type of sensory input that unfolds across time and space. However, whether SL is also possible and common when the dependent elements in different modalities are nonadjacent remains to be determined. We employed SL tasks with triplets presenting nonadjacent regularities in the visual and auditory modalities to answer this question. Specifically, relatively complex visual shapes and nonverbal environmental sounds were randomly organized to create triplets whose first and third elements were paired while the second element was variable, in the visual and auditory SL tasks, respectively. After a familiarization phase, participants’ explicit and implicit knowledge of the nonadjacent regularities was measured by recognition and familiarity judgment, respectively. Participants were capable of learning nonadjacent dependencies in both visual and auditory modalities with comparable accuracy. Nevertheless, individual differences in learning performance suggested that visual and auditory SL abilities rely on different mechanisms: participants’ explicit SL of nonadjacent visual dependencies significantly correlated with their working memory, while their implicit SL significantly correlated with their IQ scores. On the other hand, neither explicit nor implicit SL of nonadjacent auditory dependencies correlated with any of the general cognitive abilities. Further research is needed to provide direct evidence for distinct mechanisms underlying SL of nonadjacent dependencies in different modalities.
Topic Area: OTHER