Decoding Engagement in Live Lectures with EEG and Eye-Tracking
Poster Session B - Sunday, March 8, 2026, 8:00 – 10:00 am PDT, Fairview/Kitsilano Ballroom
Yimeng Wang1, Sidharth Anupkrishnan, Lisa D. Sanders; 1University of Massachusetts Amherst
Learning from instructors’ speech is a fundamental aspect of the classroom experience. Current techniques for comparing physiological measures to speech promise to provide information about engagement and performance in classroom settings. However, these approaches have rarely been used with unscripted, continuous natural speech while students/participants are free to move their heads and eyes. In the current study, a teacher delivered live lectures in engaging and boring styles on topics participants had rated as likely or unlikely to be interesting, while participants were free to look at the speaker, the accompanying slides, or their own notes. We collected multimodal physiological data, including scalp EEG from a typical laboratory set-up, EEG from a mobile headband, and data from a head-mounted eye-tracker. Temporal Response Functions (TRFs) and ERPs elicited by acoustic onsets in speech were calculated, revealing similar P1–N1–P2 complexes and the strongest neural tracking for the most preferred lecture delivered in an engaging style. Ongoing analyses of the correspondence between speech and EEG, power in EEG frequency bands, and temporal and spatial features from eye-tracking measures are being used to predict students’ self-reported engagement levels and quiz performance. Current results suggest that EEG responses and eye-tracking measures successfully capture markers of engagement, and that neural responses related to speech features are influential in classification models. Together, this research highlights the importance of using live speech as stimuli and of directly comparing mobile and lab-based recordings, paving the way for investigating speech processing in natural settings beyond the laboratory.
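For readers unfamiliar with TRF estimation, the sketch below illustrates the general idea behind relating acoustic onsets to EEG via time-lagged ridge regression. This is a generic formulation on synthetic data, not the authors’ actual pipeline; the sampling rate, lag window, regularization strength, and all variable names are assumptions made for illustration.

```python
# Minimal sketch of TRF estimation with lagged ridge regression.
# All data here are synthetic; parameters are illustrative assumptions.
import numpy as np

fs = 128                                   # sampling rate in Hz (assumed)
n_samples, n_channels = fs * 60, 32        # 1 min of data, 32 EEG channels
rng = np.random.default_rng(0)

# Stand-ins for the real signals: an acoustic-onset impulse train and EEG.
onsets = (rng.random(n_samples) < 0.02).astype(float)
eeg = rng.standard_normal((n_samples, n_channels))

# Lags from -100 ms to +400 ms, covering a P1-N1-P2-like response window.
lags = np.arange(int(-0.1 * fs), int(0.4 * fs) + 1)

# Design matrix: one column per lagged copy of the stimulus feature.
# (np.roll wraps at the edges; acceptable for a toy sketch.)
X = np.column_stack([np.roll(onsets, lag) for lag in lags])

# Ridge solution w = (X'X + aI)^-1 X'y, giving one TRF per EEG channel.
alpha = 1.0
XtX = X.T @ X + alpha * np.eye(len(lags))
trf = np.linalg.solve(XtX, X.T @ eeg)      # shape: (n_lags, n_channels)
```

The rows of `trf` trace the estimated impulse response at each lag; averaging or plotting them across channels is what yields the P1–N1–P2-like waveforms described above, and the fit of the model to held-out EEG is one common measure of neural tracking strength.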
Topic Area: ATTENTION: Multisensory