Identifying Distrust from Facial Expressions by Facial Action Coding System
Poster Presentation: Saturday, May 17, 2025, 8:30 am – 12:30 pm, Pavilion
Session: Face and Body Perception: Emotion
Gabriel RongYang Lau1, Zihao Zhao1, Shuyi Sun1, Nicole Zhi Ee Ng1,2, Bee Chin Ng2, Hong Xu1; 1School of Social Sciences, Nanyang Technological University, 2School of Humanities, Nanyang Technological University
Previous research has shown that social-behavioural cues can indicate trust, raising the question of whether trust can be read from facial expressions. In this study, we used the Facial Action Coding System (FACS) to identify facial Action Units (AUs) associated with trust toward social media news. Thirty-two participants viewed 192 randomized videos of speakers delivering news that varied by display type (monitor, hologram), emotion (neutral, happy, angry), attire (casual, doctor, nurse, police), gender (female, male), and content veracity (ambiguous, fake, real, no news). Participants rated the trustworthiness of the speaker and the news while their facial expressions were recorded with a Logitech BRIO webcam (one participant was excluded due to incomplete data). We coded the AUs displayed in each trial and conducted Wilcoxon rank-sum tests, appropriate for the non-normal distributions and unequal sample sizes, to compare trust ratings between trials with and without displayed AUs. Participants were more likely to display the following brow and lip AUs when they reported lower trust in both the speaker (W = 928,058, p < .001) and the news content (W = 625,781, p < .001): AU4 (Brow Lowerer), AU23 (Lip Tightener), AU28 (Lip Suck), AU20 (Lip Stretcher), AU1 (Inner Brow Raiser), AU12 (Lip Corner Puller), and AU15 (Lip Corner Depressor). Logistic regression showed that the likelihood of AU display increased when participants viewed angry speakers (p < .001) sharing ambiguous (p = .022) or fake news (p < .001) on the monitor display (p < .001). Our results highlight brow and lip movements as consistent indicators of distrust, modulated by news veracity, display mode, and speaker emotion. These distrust-related AUs may reflect heightened emotional arousal and increased cognitive effort during information processing.
Our findings provide insights into the physiological expression of distrust and suggest potential applications for detecting trust using facial AUs.
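The core comparison described above — trust ratings on trials with vs. without a displayed AU, tested with a Wilcoxon rank-sum test — can be sketched as follows. This is a minimal illustration on simulated ratings, not the authors' dataset or analysis code; the sample sizes, rating scale, and effect direction are invented for demonstration.

```python
import numpy as np
from scipy.stats import mannwhitneyu  # Wilcoxon rank-sum test for independent samples

rng = np.random.default_rng(0)

# Hypothetical 7-point trust ratings: trials where a distrust-related AU
# (e.g. AU4, Brow Lowerer) was displayed vs. trials where it was not.
# Unequal group sizes and ordinal ratings motivate the rank-based test.
trust_with_au = rng.integers(1, 5, size=300)      # simulated lower-trust trials
trust_without_au = rng.integers(3, 8, size=500)   # simulated higher-trust trials

# Two-sided Mann-Whitney U (equivalent to the Wilcoxon rank-sum test),
# which makes no normality assumption and tolerates unequal n.
stat, p = mannwhitneyu(trust_with_au, trust_without_au, alternative="two-sided")
print(f"U = {stat:.0f}, p = {p:.3g}")
```

A per-AU analysis would simply repeat this comparison for each coded AU (AU1, AU4, AU12, AU15, AU20, AU23, AU28), splitting trials by whether that AU was displayed.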