Representations of dynamic facial expressions are shaped by both emotional and social features

Poster Presentation: Tuesday, May 20, 2025, 8:30 am – 12:30 pm, Pavilion
Session: Face and Body Perception: Social cognition, behavioural

Hilal Nizamoğlu1,2, Fatma Celebi1, Katharina Dobs1,2; 1Justus Liebig University Giessen, Germany, 2Center for Mind, Brain and Behavior, Universities of Marburg, Giessen and Darmstadt

Dynamic facial expressions play a crucial role in daily life by revealing others' emotions, social signals, and mental states. But what dimensions underlie humans' perception of these expressions? To address this question, we measured the perceived similarity of a large-scale set of both emotional and conversational facial expressions and used representational similarity analysis (RSA) to predict this similarity space. Participants (N=19) performed a multi-arrangement similarity judgment task on 48 videos of four actors performing 12 predefined expressions: six emotional (e.g., 'happily surprised') and, crucially, six conversational (e.g., 'disagree'). Across the stimulus set, we quantified emotional dimensions (valence, arousal, affectiveness), social dimensions (social relevance, friendliness), and motion features (e.g., head and facial part movements) using independent ratings. We then used these features, along with the predefined expression and actor identity, to predict participants' similarity judgments. We found that all emotional and social dimensions, as well as expression, significantly predicted similarity judgments (FDR, q < 0.01), while motion features and identity had limited influence. Among the significant predictors, only valence, social relevance, and expression contributed unique variance (p < 0.001) to the similarity judgments. These findings highlight that observers rely on both emotional and social dimensions to represent dynamic facial expressions.
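The RSA approach described above can be sketched in a minimal form: build one model representational dissimilarity matrix (RDM) per rated dimension, then regress the behavioral dissimilarities on these model RDMs to ask which dimensions carry unique predictive weight. The sketch below uses simulated ratings and two illustrative dimensions (valence and social relevance); all variable names and numbers here are hypothetical, not the study's actual data or full predictor set.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical independent ratings for 48 videos on two example dimensions
# (the study additionally used arousal, affectiveness, friendliness,
# motion features, expression, and actor identity).
n_videos = 48
ratings = {
    "valence": rng.uniform(-1, 1, n_videos),
    "social_relevance": rng.uniform(0, 1, n_videos),
}

def feature_rdm(values):
    """Model RDM: pairwise absolute rating differences, upper triangle."""
    diffs = np.abs(values[:, None] - values[None, :])
    iu = np.triu_indices(len(values), k=1)
    return diffs[iu]

n_pairs = n_videos * (n_videos - 1) // 2  # 1128 pairs
# Stack model RDMs as regressors, with an intercept column.
X = np.column_stack([np.ones(n_pairs)]
                    + [feature_rdm(v) for v in ratings.values()])

# Simulated behavioral dissimilarities; in the study these came from
# the multi-arrangement task (on-screen distances between videos).
true_weights = np.array([0.1, 0.6, 0.3])
behav_rdm = X @ true_weights + rng.normal(0, 0.05, n_pairs)

# Least-squares fit: each beta estimates one dimension's contribution
# to judged dissimilarity while controlling for the others.
betas, *_ = np.linalg.lstsq(X, behav_rdm, rcond=None)
```

In practice, unique contributions of this kind are typically assessed with variance partitioning or permutation tests across participants rather than a single regression, but the regression captures the core logic of predicting a behavioral similarity space from feature-based model RDMs.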

Acknowledgements: This work was supported by DFG, Germany SFB/TRR 135 (grant number 222641018) TP C9 and S; and by the Research Cluster "The Adaptive Mind", funded by the Hessian Ministry of Higher Education, Research, Science and the Arts.