Dynamic and Flexible Feature Routing in Brain Pathways for Different Face Perceptions

Poster Presentation: Tuesday, May 20, 2025, 8:30 am – 12:30 pm, Pavilion
Session: Face and Body Perception: Features

Yuening Yan1, Jiayu Zhan2, Hui Yu1, Chen Zhou1, Oliver G. B. Garrod1, Robin A. A. Ince1, Rachael E. Jack1, Philippe G. Schyns1; 1School of Psychology and Neuroscience, University of Glasgow, 2School of Psychological and Cognitive Sciences, Peking University

Faces can be perceived differently based on distinct features: static 3D shape/complexion for identity (e.g. ‘Mary’) and transient movements (Action Units, AUs) for emotion (e.g. ‘Happy’). However, how the brain dynamically routes these features for identity recognition and emotion categorization remains unknown. Using a generative model of the human face, we independently manipulated the 3D shape/complexion of six face identities and the AU features of six basic emotions (happy, surprise, fear, disgust, anger, sad). Participants first learned to identify the six identities and classify the six emotions with 100% accuracy. In the neuroimaging phase, participants viewed 36,000 facial animations and categorized them according to identity, emotion, or both (identity+emotion; N = 8 participants per condition). We recorded their MEG/behavioral responses during the task. All participants viewed identical facial animations. Information-theoretic analyses revealed where and when MEG source amplitudes represented identity and emotion features (replicated across all individual participants, p < 0.05, FWER-corrected):

1. Emotion: the social pathway selectively routed representations of dynamic emotions and their individual AUs laterally to the Superior Temporal Gyrus, with task-irrelevant identities briefly represented in the Occipital Cortex (OC).
2. Identity: the occipito-ventral pathway selectively routed static 3D identity representations to the Inferior Temporal Gyrus, with task-irrelevant emotions briefly represented in OC.
3. Dual-task: identities and emotions were routed separately via the ventral and social pathways, demonstrating task-specific flexibility in feature processing.

These findings show that the brain dynamically and flexibly routes specific static and dynamic facial features via separate pathways depending on the perceptual task. When the same features are task-irrelevant, their representations are limited to early visual areas. Our study offers a novel framework for understanding how the brain computes 4D social information critical for socio-emotional perception and decision-making.
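
To make the analysis concrete, below is a minimal sketch (not the authors' pipeline) of how one can test where and when a single MEG source represents a stimulus feature: it estimates mutual information between a binary feature (e.g. an AU present or absent in an animation) and single-trial source amplitude at each time point, then derives an FWER-corrected significance threshold from a max-statistic permutation null. The data are simulated, and the variable names, trial counts, and Gaussian MI approximation are illustrative assumptions.

# Sketch only: MI between a binary stimulus feature and MEG source amplitude
# over time, with a max-statistic permutation threshold (FWER control).
import numpy as np

rng = np.random.default_rng(0)

def gaussian_mi_binary(x, y):
    """Approximate MI (bits) between binary x and continuous y,
    modelling y as Gaussian overall and within each class of x."""
    x = np.asarray(x, bool)
    y = np.asarray(y, float)
    p1 = x.mean()
    h_y = 0.5 * np.log2(2 * np.pi * np.e * y.var())
    h_y_given_x = ((1 - p1) * 0.5 * np.log2(2 * np.pi * np.e * y[~x].var())
                   + p1 * 0.5 * np.log2(2 * np.pi * np.e * y[x].var()))
    return max(h_y - h_y_given_x, 0.0)

# Simulated single-participant data: trials x time points for one MEG source.
n_trials, n_times = 360, 200
au_present = rng.integers(0, 2, n_trials).astype(bool)   # feature on ~half the trials
meg = rng.standard_normal((n_trials, n_times))
meg[au_present, 80:120] += 0.5                            # injected feature effect

# Observed MI time course.
mi_obs = np.array([gaussian_mi_binary(au_present, meg[:, t]) for t in range(n_times)])

# Permutation null: shuffle feature labels, keep the maximum MI over time on
# each permutation; its 95th percentile gives an FWER-corrected threshold.
n_perm = 200
max_null = np.empty(n_perm)
for p in range(n_perm):
    perm = rng.permutation(au_present)
    max_null[p] = max(gaussian_mi_binary(perm, meg[:, t]) for t in range(n_times))
threshold = np.percentile(max_null, 95)

sig = np.where(mi_obs > threshold)[0]
print(f"FWER-corrected MI threshold: {threshold:.3f} bits")
print(f"Significant time points: {sig.min()}-{sig.max()}" if sig.size else "None significant")

In practice the same logic would be applied per source, per feature, and per participant, with the max statistic taken over all tested sources and time points to control the family-wise error rate across the whole search space.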

Acknowledgements: This work was funded by the Wellcome Trust (Senior Investigator Award, UK; 107802) and the Multidisciplinary University Research Initiative/Engineering and Physical Sciences Research Council (USA, UK; 172046-01) (P.G.S.); the ERC [FACESYNTAX; 759796] (R.E.J.); and the Wellcome Trust [214120/Z/18/Z] (R.A.A.I.).