Poster E82, Monday, March 27, 2:30 – 4:30 pm, Pacific Concourse
Implicit memory for content and speaker of messages heard during slow-wave sleep
Simon Ruch1,2, Romi Zäske3,4, Marc Alain Züst1,2, Stefan R. Schweinberger3, Katharina Henke1,2; 1Department of Psychology, University of Bern, Bern, Switzerland, 2Center for Cognition, Learning and Memory, University of Bern, Bern, Switzerland, 3Department for General Psychology and Cognitive Neuroscience, Institute of Psychology, Friedrich Schiller University of Jena, Jena, Germany, 4Department of Otorhinolaryngology, Jena University Hospital, Jena, Germany
Although sleep is a state of unconsciousness, the sleeping brain does not completely cease to process external events. In fact, the brain can distinguish between meaningful and meaningless messages and can even learn contingencies between non-verbal events while asleep. Here, we asked whether sleeping humans can encode new verbal messages, learn the voices of unfamiliar speakers, and form associations between speakers and messages. To this end, we presented 28 sentences uttered by 28 unfamiliar speakers to participants who were in EEG-defined slow-wave sleep. After waking, participants performed three tests that assessed recognition of sleep-played speakers, messages, and speaker-message associations. Recognition accuracy in all tests was at chance level, suggesting that sleep-played stimuli were not learned explicitly. However, response latencies were significantly shorter for correct vs. incorrect decisions in the message recognition test, indicating implicit memory for sleep-played messages (but not for speakers or speaker-message combinations). Furthermore, participants with excellent implicit memory for sleep-played messages also displayed implicit memory for speakers (but not speaker-message associations), as suggested by a significant correlation between the response-latency differences for the recognition of messages and of speakers. Implicit memory for speakers was corroborated by EEG at test: listening to sleep-played vs. new speakers evoked a late centro-parietal negativity. Event-related EEG recorded during sleep revealed that peaks resembling the up-states of sleep slow waves contributed to sleep-learning: participants with larger evoked slow-wave peaks later showed stronger implicit memory. Overall, humans appear able to implicitly learn the semantic content and the speakers of sleep-played messages, and these forms of sleep-learning are mediated by slow waves.
Topic Area: LONG-TERM MEMORY: Episodic