Modelling Strength-Based Representations in Memory Using Spiking Neural Networks

Poster Session C - Sunday, March 8, 2026, 5:00 – 7:00 pm PDT, Fairview/Kitsilano Ballroom
Also presenting in Data Blitz Session 4 - Saturday, March 7, 2026, 10:30 am – 12:00 pm PST, Salon F.

Patrick Tsapoitis1, Jakeb Chouinard1, Myra Fernandes1; 1University of Waterloo

Reading words aloud, compared to silently, during encoding may enhance memory by strengthening representations. Computational modelling enables manipulation of representation strength and can be used to simulate memory. Our model, Encoding-based Strength Representation in Memory (ESRM), was created using the Nengo Python library. Nengo uses the Neural Engineering Framework to enable mapping cognitive models onto spiking neural networks. ESRM includes simplified hippocampal memory components that account for primacy, recency, and decay. It reinforces 'read aloud' words by creating an additional, distinct, and general memory representation. Recall in ESRM is determined by neural populations competitively accumulating evidence of word representations to a threshold, using a previously established minimum of 30% to filter irrelevant evidence. Accumulation to threshold denotes successful recall. Results from 50 simulations with ESRM were compared to behavioural data (N=50) in which participants were presented with a set of words, one at a time, and asked to read them either aloud or silently, intermixed within-subjects. A 2 (Source: ESRM vs. Human) x 2 (Cue: aloud vs. silent) mixed ANOVA revealed a significant effect of Cue, with better memory for words read aloud than silently, but no effect of Source. We then compared memory with and without added noise, to mimic real-world divided-attention conditions. ESRM again replicated human data: the production benefit from reading aloud versus silently remained, though it was reduced in magnitude. ESRM can successfully be used to model the effectiveness of encoding techniques, expanding applicability to populations where attention may be compromised (aging, MCI, TBI).
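The recall mechanism described above — competitive accumulation of evidence to a threshold, with a 30% minimum to filter irrelevant evidence — can be sketched in plain Python. This is an illustrative toy, not the ESRM implementation: the actual model runs on spiking neural populations in Nengo, and the threshold, drift rates, noise level, and word strengths below are invented for demonstration. Only the 30% evidence floor comes from the abstract.

```python
import random

def simulate_recall(strengths, threshold=1.0, floor=0.30, dt=0.001,
                    noise_sd=0.02, max_t=2.0, seed=0):
    """Toy competitive accumulator model of recall.

    Each word's representation strength drives an evidence accumulator;
    evidence below `floor` (the 30% minimum from the abstract) is
    filtered out. A word counts as recalled if its accumulator reaches
    `threshold` within `max_t` seconds of simulated time. All parameter
    values here are illustrative assumptions, not ESRM's parameters.
    """
    rng = random.Random(seed)
    acc = {word: 0.0 for word in strengths}
    recalled = []
    t = 0.0
    while t < max_t:
        for word, s in strengths.items():
            if word in recalled:
                continue
            drive = s if s >= floor else 0.0   # filter irrelevant evidence
            # Drift plus small diffusion noise, Euler-integrated.
            acc[word] += drive * dt + rng.gauss(0.0, noise_sd) * dt ** 0.5
            if acc[word] >= threshold:
                recalled.append(word)
        t += dt
    return recalled

# A word read aloud gets a stronger representation than one read
# silently; a very weak trace falls below the 30% evidence floor.
strengths = {"aloud_word": 0.9, "silent_word": 0.6, "weak_word": 0.2}
print(simulate_recall(strengths))  # → ['aloud_word', 'silent_word']
```

The stronger "aloud" trace crosses threshold first, reproducing the production benefit qualitatively; raising `noise_sd` shrinks the gap between conditions, analogous to the divided-attention simulation described in the abstract.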

Topic Area: LONG-TERM MEMORY: Other

March 7 – 10, 2026