Using dynamic key depth to identify the emergence of diagnostic features
Undergraduate Just-In-Time Abstract
Poster Presentation: Sunday, May 18, 2025, 8:30 am – 12:30 pm, Banyan Breezeway
Session: Undergraduate Just-In-Time 1
Mathias Salvas-Hébert1, Guillaume Lalonde-Beaudoin2, Ian Charest1, Daniel Fiset2, Caroline Blais2, Frédéric Gosselin1; 1Département de psychologie, Université de Montréal, 2Département de psychoéducation et de psychologie, Université du Québec en Outaouais
Psychophysics characterizes the relationship between experimental conditions and behavioral responses, offering insights into perceptual and decision-making processes. Among these responses, key presses are the most widely used. Traditionally discrete, key presses can now be tracked continuously using analog keyboards, providing fine-grained motor data. In this study, we leveraged Wooting analog keyboards to capture real-time responses with millisecond precision. Using Python and the tachypy (https://github.com/Charestlab/tachypy) and pyWooting libraries, we implemented a face classification task in which participants judged emotional expressions (joy or fear) in stimuli partially revealed through sparse Gaussian apertures (“bubbles”). Five participants completed 592 trials, with the number of apertures dynamically adjusted to maintain 75% accuracy. Continuous key-depth data were recorded throughout each trial, allowing analysis of motor dynamics beyond traditional binary responses. As expected, the facial regions associated with accurate classifications aligned with established findings. Interestingly, classification images derived from 10 ms bins of key positions (using the most depressed key within each bin as the response) revealed that the left eye region, from the observer’s point of view, drove responses as early as 500 ms, about 50 ms earlier than the right eye region.
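The binning rule described above, taking the most depressed of the two response keys within each 10 ms window, can be sketched as follows. This is a minimal illustration, not the authors' analysis code: the function name, the array layout (per-sample timestamps plus one depth trace per key), and the 0/joy vs. 1/fear coding are all assumptions made for the example.

```python
import numpy as np

def binned_responses(t_ms, depth_joy, depth_fear, bin_ms=10):
    """Assign a discrete response to each time bin of a trial.

    Hypothetical helper: for every bin of width `bin_ms`, the response is
    the key (0 = joy, 1 = fear) whose maximum depth within the bin is
    largest; bins with no samples are marked -1.

    t_ms       : 1-D array of sample timestamps in milliseconds
    depth_joy  : key depth of the "joy" key at each sample
    depth_fear : key depth of the "fear" key at each sample
    """
    n_bins = int(np.ceil(t_ms.max() / bin_ms))
    responses = np.full(n_bins, -1, dtype=int)
    for b in range(n_bins):
        in_bin = (t_ms >= b * bin_ms) & (t_ms < (b + 1) * bin_ms)
        if not in_bin.any():
            continue  # no keyboard samples fell in this bin
        # "most depressed key within each bin" as the bin's response
        responses[b] = 0 if depth_joy[in_bin].max() >= depth_fear[in_bin].max() else 1
    return responses
```

One classification image per bin can then be computed by correlating these per-bin responses with the bubble masks across trials, which is how a time course of diagnostic facial regions emerges from continuous key depth.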