Linking Conceptual Knowledge of Emotions to Visual Representations of Facial Expressions Using a Reverse Correlation Approach with Genetic Algorithms
Poster Presentation: Saturday, May 17, 2025, 8:30 am – 12:30 pm, Pavilion
Session: Face and Body Perception: Emotion
Émilie St-Pierre1, Jeanine Ohene-Agyei2, Arianne Richer1, Alexis Bellerose1, Francis Gingras1,3, Zohair Mharchat1, Camille Saumure4, Daniel Fiset1, Roberto Caldara4, Caroline Blais1; 1University of Quebec in Outaouais, Canada, 2University of Toronto, Canada, 3University of Quebec in Montreal, Canada, 4University of Fribourg, Switzerland
Facial expressions of emotion are crucial for effective social communication, yet remain subject to perceptual confusion. Previous research using a reverse correlation technique demonstrated that individual conceptual knowledge shapes and predicts visual representations of emotions (Brooks & Freeman, 2018). However, classical reverse correlation methods require many trials and have so far been used to explore conceptual and visual relationships only between pairs of affective states. Combining a genetic algorithm with reverse correlation to manipulate facial features on photorealistic avatars should offer a faster, more efficient, and more naturalistic method for generating facial representations (Binetti et al., 2022). To probe this hypothesis, we used this approach to investigate the relationship between participants’ conceptual knowledge of the six basic emotions and pain and their visual representations of the corresponding facial expressions. Each participant completed a conceptual task, rating associations between these seven affective states and 45 word or phrase stimuli. This enabled the construction of individual-level similarity matrices reflecting each participant’s unique pattern of conceptual overlap among the affective states. Participants also completed a reverse correlation task to capture their visual representations of facial expressions for the same affective states in male and female faces. Perceptual overlap among visual representations was measured using two approaches. First, individual-level similarity matrices were generated based on the degree to which the representation of each expression shared similar activation patterns across 59 facial features. Second, another set of similarity matrices was generated from ratings by 20 independent observers assessing the perceived intensity of each affective state in each representation. Inter-matrix correlation analyses examined the relationship between conceptual and perceptual similarity matrices. Permutation analyses confirmed that these correlations significantly exceeded chance level, providing robust evidence of a link between conceptual and perceptual similarity in facial expression representations. Crucially, our data show that this relationship is characterized by marked individual differences, revealing a complex interplay between conceptual knowledge and visual representations.
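The following is a minimal sketch, in Python with NumPy, of the kind of analysis pipeline described above: building one participant's conceptual and perceptual similarity matrices and testing their inter-matrix correlation with a permutation test. It is not the authors' analysis code; the array shapes (7 affective states, 45 conceptual stimuli, 59 facial features), the variable names, and the use of pairwise correlations as the similarity measure are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)

N_STATES = 7      # six basic emotions + pain
N_WORDS = 45      # word/phrase stimuli in the conceptual task
N_FEATURES = 59   # manipulated facial features in the reverse correlation task

# Placeholder data standing in for one participant's responses:
# per-state conceptual rating profiles and per-state facial-feature activation profiles.
conceptual_ratings = rng.random((N_STATES, N_WORDS))
feature_activations = rng.random((N_STATES, N_FEATURES))

# Individual-level similarity matrices: pairwise correlations between the states' profiles.
conceptual_sim = np.corrcoef(conceptual_ratings)    # 7 x 7
perceptual_sim = np.corrcoef(feature_activations)   # 7 x 7

# Inter-matrix correlation computed over the off-diagonal (upper-triangle) entries.
iu = np.triu_indices(N_STATES, k=1)
observed_r = np.corrcoef(conceptual_sim[iu], perceptual_sim[iu])[0, 1]

# Mantel-style permutation test: shuffle the state labels of one matrix
# to build a null distribution of inter-matrix correlations.
n_perm = 10000
null_r = np.empty(n_perm)
for i in range(n_perm):
    p = rng.permutation(N_STATES)
    permuted = perceptual_sim[np.ix_(p, p)]
    null_r[i] = np.corrcoef(conceptual_sim[iu], permuted[iu])[0, 1]

p_value = (np.sum(null_r >= observed_r) + 1) / (n_perm + 1)
print(f"observed r = {observed_r:.3f}, permutation p = {p_value:.4f}")

With only seven states there are 7! = 5,040 distinct relabelings, so an exhaustive enumeration would also be feasible in place of random permutations.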
Acknowledgements: This study is supported by the Canada Research Chair in Cognitive and Social Vision (Caroline Blais, #CRC-2023-00019), NSERC Discovery Grant (Caroline Blais, #RGPIN-2019-06201), and the NSERC Graduate Scholarship Doctoral Program (Émilie St-Pierre, #ES D-590029).