An Asian-based Database of Emotional Body Motion Induced by Diverse Daily Situations

Poster Presentation: Sunday, May 18, 2025, 2:45 – 6:45 pm, Banyan Breezeway
Session: Face and Body Perception: Body

Chiahuei Tseng1, Miao Cheng, Ken Fujiwara, Yoshifumi Kitamura; 1Research Institute of Electrical Communication, Tohoku University, Japan, 2Interdisciplinary ICT Research Center for Cyber and Real Spaces, Tohoku University, Japan, 3Department of Psychology, National Chung Cheng University, Taiwan

Emotion understanding from body movements is crucial for non-verbal communication, yet our progress lags far behind our knowledge of facial expressions. One reason is the lack of a robust database representative enough to cover the complexity of emotional expression in the real world. Most current databases of bodily emotion expression are culturally skewed toward Western countries, and many contain only simple, repetitive actions (e.g., walking, knocking) involving parts of the body alone. We expanded the current repertoire of human bodily emotions by recruiting Asian professional performers varying in personal characteristics (e.g., gender, age, performing experience). In a motion capture lab, performers wore whole-body suits with 57 retro-reflective markers attached to major joints and body segments and expressed 7 basic emotions (joy, sadness, anger, fear, contempt, disgust, and surprise) and 5 social emotions (gratitude, jealousy, pride, shame, and guilt) through whole-body movements. For each emotion, actors performed three self-created scenarios covering a broad range of real-life events, eliciting the target emotion within 2-5 seconds at 3 intensities (low, medium, and high). No actor saw any other's performance, so each actor's interpretation remained uninfluenced by others, yielding a wide range of interpretations and movement patterns. After all motion capture sessions were completed, we interviewed each performer to elaborate on their performing scenarios and acting strategies. Human evaluation of the database revealed (1) emotion discrimination accuracy comparable to that of Western databases containing standardized performance scenarios; (2) an own-race advantage in emotion recognition accuracy between Asian and non-Asian participants; and (3) that accurate situational information enhances the understanding of bodily emotion.
These results suggest that a database built on a novel emotion-induction approach using personalized scenarios will contribute to a more comprehensive understanding of emotional expression across diverse contexts.