Poster Session D, Monday, March 25, 8:00 – 10:00 am, Pacific Concourse
Brain Mechanisms for Processing Static and Dynamic Facial Expressions
Sing-Rong Sie1, Shih-Tseng T. Huang1,2, Yen-Ju Lu1; 1Department of Psychology, National Chung-Cheng University, Taiwan, 2Center for Research in Cognitive Science, National Chung-Cheng University, Taiwan
In this study, 25 college students (ages 20–25) participated, and their functional magnetic resonance imaging (fMRI) data were collected while they viewed static (image) and dynamic (video) stimuli comprising angry, fearful, happy, and neutral facial expressions, as well as non-face objects. Stimuli were presented in random order, and participants pressed a button whenever they saw a non-face stimulus (i.e., an object). Analyses compared each stimulus type (including objects) to neutral faces. For static images, the postcentral gyrus was more active when participants viewed angry faces; the inferior temporal gyrus, superior parietal lobule, and inferior parietal lobule were active when they viewed fearful faces; and the hippocampus and thalamus were active when they viewed non-face objects. For dynamic stimuli (videos), the inferior parietal lobule and middle frontal gyrus were active when participants watched happy-face videos, whereas the precentral gyrus, calcarine sulcus, inferior parietal lobule, mid-cingulate cortex, supramarginal gyrus, and insula were active when they watched videos of non-face objects. These results suggest that different brain areas are engaged when viewing static images versus dynamic videos. We further conducted an autoencoder-model analysis using the average of ICA components as the baseline, together with per-participant SVM analyses, which identified significant active brain regions when each type of facial expression was compared with neutral faces at the level of individual participants.
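The abstract does not specify the per-participant SVM procedure. A minimal sketch, assuming trial-wise ICA-component features are classified as expression versus neutral with a linear SVM and cross-validated accuracy (all data here are synthetic and the feature dimensions are illustrative, not the authors' actual pipeline):

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

# Hypothetical per-participant decoding: one facial expression (e.g., angry)
# versus neutral, using ICA-component features as predictors.
# Shapes (n_trials x n_components) are assumptions for illustration only.
rng = np.random.default_rng(0)
n_trials, n_components = 40, 20
X_expr = rng.normal(0.5, 1.0, (n_trials, n_components))  # expression trials
X_neut = rng.normal(0.0, 1.0, (n_trials, n_components))  # neutral trials
X = np.vstack([X_expr, X_neut])
y = np.array([1] * n_trials + [0] * n_trials)

clf = SVC(kernel="linear")                  # linear kernel: weights map back
                                            # onto the input components
scores = cross_val_score(clf, X, y, cv=5)   # 5-fold cross-validated accuracy
print(f"mean CV accuracy: {scores.mean():.2f}")
```

Fitting one such model per participant, rather than pooling subjects, is what allows expression-selective regions to be reported at the individual level.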
Topic Area: EMOTION & SOCIAL: Emotion-cognition interactions