The role of torsion during self-motion
Poster Presentation: Tuesday, May 20, 2025, 8:30 am – 12:30 pm, Banyan Breezeway
Session: Eye Movements: Natural or complex tasks
Andrés H. Méndez1, Cristina de la Malla1, Joan López-Moliner1; 1Vision and Control of Action Group, Institute of Neurosciences, Universitat de Barcelona, Catalonia, Spain
Rigorously tracking eye and head behavior in space is key to constructing realistic models of the stimuli that reach our eyes and to elucidating the neural mechanisms underlying visually guided behaviors. The 3D orientation and movement of the eyes in the head and of the head in the world not only determine the line of sight but also generate the moment-to-moment patterns of visual flow that stimulate the retina. While horizontal and vertical eye movements have been characterized in real-world settings, eye torsion remains largely unexplored outside the lab, and its relevance for visual perception has often been overlooked. Here, we leveraged head-mounted technology to measure torsional eye movements during locomotion. Ten subjects wore a head-mounted device in a static condition and while walking and fixating a distant target. We combined manual and automatic coding to extract eye torsion and compared eye azimuth, elevation, and torsion with head yaw, pitch, and roll, respectively. We then applied optic flow algorithms to the raw head-centered and to the torsion-corrected retina-centered videos to describe the consequences of ocular counter-roll for the retinal flow. We show that eye torsion effectively compensates for head roll during locomotion, altering the incoming visual flow in ways that could be relevant for the extraction of self-motion parameters.
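As a minimal sketch (not the authors' actual pipeline, whose details are not given in the abstract), the torsion correction described above can be thought of as applying an in-plane rotation by the measured torsion angle to head-centered image coordinates and flow vectors before estimating self-motion parameters; the sign convention for torsion below is an assumption:

```python
import numpy as np

def counter_roll_correct(points, flow, torsion_deg):
    """Approximate retina-centered flow from head-centered flow.

    points : (N, 2) image coordinates relative to the line of sight
    flow   : (N, 2) optic-flow vectors at those points
    torsion_deg : measured eye torsion; positive is taken here as a
                  counter-clockwise rotation (hypothetical convention)
    """
    t = np.deg2rad(torsion_deg)
    # Standard 2D rotation matrix; both the sample positions and the
    # flow vectors rotate by the same torsion angle.
    R = np.array([[np.cos(t), -np.sin(t)],
                  [np.sin(t),  np.cos(t)]])
    return points @ R.T, flow @ R.T
```

If torsion cancels head roll, applying this correction to head-centered flow should yield a flow field that is stable against head roll, which is the comparison the raw-versus-corrected video analysis makes.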
Acknowledgements: This work was supported by grant PID2023-150081NB-I00 to JLM and grants PID2023-150883NB-I00 and CNS2022-135808 to CM, funded by MICIU/AEI/10.13039/501100011033 and the European Union NextGenerationEU/PRTR. AM was supported by grant PRE2021-097688, funded by MICIU/AEI/10.13039/501100011033 and by the FSE+.