
Poster B140

How do we evaluate others’ memories?

Poster Session B - Sunday, April 14, 2024, 8:00 – 10:00 am EDT, Sheraton Hall ABC

Dr Talya Sadeh1, Avi Gamoran1, Lilach Lieberman1, Michael Gilead2, Ian Dobbins3; 1Ben-Gurion University of the Negev, 2Tel Aviv University, 3Washington University in Saint Louis

Humans have the highly adaptive ability to learn from others’ memories. However, because memories are fallible, others’ memories are a valuable source of information only if we can assess their veracity. Surprisingly little is known about how this is done. Previous studies have shown that self-reported memory justifications contain information that can be used to distinguish true from false recollections by modelling linguistic features of the text. But do humans process this information in the same way a model does? To address this question, we used justification data collected by Dobbins & Kantner (2019), in which memory for word lists was examined using a yes/no recognition test, followed by written justifications for each recognition decision (e.g., “I remember repeating DOCTOR in my head because I remembered I had a doctor appointment”). These justifications, corresponding to Hits and False Alarms, were presented to participants in a pre-registered online study. Participants were asked to assess whether the witness’s recognition was correct or incorrect based on the memory justifications. Results show that human raters can discriminate Hits from False Alarms above chance levels based on these justifications. In doing so, raters rely on markers of recollective experiences in witnesses, having learned that these experiences (in themselves) signal accurate memory. Finally, the results show that features generated from humans’ assessments can augment machine-learning language models trained to classify memories.

Topic Area: LONG-TERM MEMORY: Episodic


April 13–16  |  2024