Development and validation of a new dynamic facial expression database

Poster Presentation: Saturday, May 17, 2025, 8:30 am – 12:30 pm, Pavilion
Session: Face and Body Perception: Emotion

Sophia Lipetzky1, Susan G. Wardle1, J. Brendan Ritchie1, Chris I. Baker1, Shruti Japee1; 1National Institute of Mental Health

Facial expressions are essential for effective interpersonal communication and social interaction. However, most prior research aimed at understanding how humans process the expressions of others has used static images of racially homogeneous, highly posed facial expressions as stimuli, which have low real-world relevance. To improve the ecological validity of stimuli used in facial expression research, we developed the Facial and Body Movement Database (FaBMoD), which consists of video recordings of twenty-three racially diverse professional actors, each making nine different facial expressions oriented in seven different directions. To validate these expression videos, participants (n=10) completed a computer task in which they were shown each of the 204 front-facing videos and asked to identify the facial expression. Participants also rated each video for intensity, genuineness, and valence on a scale of 1 to 9. Results indicate good overall correspondence between the intended expression and the expression perceived by participants for most videos. This was especially true for videos depicting happiness and sadness, whereas videos intended to depict fear were sometimes perceived as surprise. Intensity ratings for most emotional expressions were reasonably high, with participants predominantly rating videos between 6 and 8 on a scale from 1 (not intense at all) to 9 (very intense). By contrast, videos depicting sadness or neutral expressions were generally rated as less intense. Genuineness ratings showed a similar trend, with emotional expressions rated as quite genuine and neutral expressions typically rated as less genuine. This new dynamic facial expression database, along with the validation data, will be used in our future studies of facial expression processing and will be shared with other researchers worldwide.