Facial emotion processing in naturalistic contexts

Poster Presentation: Saturday, May 17, 2025, 8:30 am – 12:30 pm, Pavilion
Session: Face and Body Perception: Emotion

Xiaoxu Fan1, Abhishek Tripathi2, Lily Chamakura1, Kelly Bijanki1; 1Baylor College of Medicine, 2Rice University

Facial emotion processing has been widely investigated using highly controlled, artificial paradigms, yet it remains unclear how we interpret facial emotions in real life, where multimodal and contextual cues are available and interact naturally. In this study, we analyzed neural data collected from human participants watching an audiovisual film, using AI models to extract 48 dynamic facial emotions and encoding models to evaluate neural representations of these emotional signals. We found that activity in dorsolateral prefrontal cortex (DLPFC), anterior superior temporal cortex (aSTC), and posterior superior temporal cortex (pSTC) represents facial emotion in naturalistic contexts. In our data, however, children’s DLPFC did not encode these emotional features, suggesting a developmental difference in affective processing. Additionally, the human voice appears to enhance the representation of facial emotion in pSTC and to alter the encoding of specific emotion categories in pSTC and aSTC. Taken together, our results reveal the coding of 48 facial emotions in ecologically valid settings, leveraging AI-based emotion analysis to advance our understanding of affective processing in the human brain.
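The encoding-model approach described above can be sketched in a minimal form: regress time-varying emotion features onto a neural signal and score the fit by held-out prediction correlation. This is a hypothetical illustration, not the study's actual pipeline; the data are simulated, and the 48-feature design and ridge penalty are assumptions for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: T time points, 48 emotion features, one neural channel.
# In a real analysis, X would hold AI-derived facial-emotion ratings of the
# film and y a recorded neural signal; here both are simulated.
T, F = 1000, 48
X = rng.standard_normal((T, F))            # emotion features over time
true_w = rng.standard_normal(F)            # simulated feature weights
y = X @ true_w + rng.standard_normal(T)    # simulated neural response + noise

def ridge_fit(X, y, alpha=1.0):
    """Closed-form ridge regression: w = (X'X + alpha*I)^-1 X'y."""
    n_features = X.shape[1]
    return np.linalg.solve(X.T @ X + alpha * np.eye(n_features), X.T @ y)

# Fit on the first half, predict the second half, and score by the
# correlation between predicted and observed responses -- a common
# measure of how well a region "encodes" the feature set.
w = ridge_fit(X[:500], y[:500], alpha=10.0)
pred = X[500:] @ w
r = np.corrcoef(pred, y[500:])[0, 1]
print(f"encoding-model prediction correlation: r = {r:.2f}")
```

In practice such models are typically fit per electrode or voxel, with cross-validated regularization and temporal lags between stimulus features and neural response.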