EDIA: An open-source toolbox for virtual reality-based eye tracking research using Unity

Poster Presentation: Tuesday, May 20, 2025, 8:30 am – 12:30 pm, Banyan Breezeway
Session: Eye Movements: Natural or complex tasks

Felix Klotzsche1,3, Jeroen de Mooij2, Sven Ohl3, Michael Gaebler1; 1Max Planck Institute for Human Cognitive and Brain Sciences, Leipzig, Germany, 2thefirstfloor.nl, Rotterdam, Netherlands, 3Humboldt-Universität zu Berlin, Germany

Virtual reality (VR) is increasingly used in vision science experiments, with recent headsets featuring built-in eye tracking proving particularly valuable for researchers. However, these headsets are not research-grade, can require time-intensive software development, and raise questions about the quality of their eye tracking data for scientific purposes. The EDIA toolbox addresses these challenges by helping vision scientists design and conduct eye tracking experiments in VR using the Unity game engine. A key feature of EDIA is its integration with multiple commercially available VR headsets. This enables user-friendly data extraction from built-in eye trackers at their native temporal resolution, all accessible via a standardized interface. EDIA provides tools for logging, visualizing, and streaming (eye tracking) data, and allows experimenters to easily build reusable components. This flexibility facilitates switching hardware and reusing code across experiments and setups. EDIA supports data streaming via the Lab Streaming Layer (LSL) network protocol, enabling synchronization with external data sources such as EEG. Additionally, it offers remote monitoring and control of experimental applications, a feature particularly relevant for mobile VR setups. For experiments implemented using EDIA, researchers can alter relevant study parameters (e.g., the design matrix) using platform-independent configuration files. This allows researchers and their assistants to customize experimental sessions without modifying Unity code. A validation module enables researchers to monitor and compare eye tracking data quality throughout an experiment, across studies, and between different devices. We demonstrate EDIA in a study evaluating eye tracking performance in five state-of-the-art VR headsets: three mobile models (Meta Quest Pro, PICO 4 Enterprise, HTC Vive Focus 3) and two tethered devices (HTC Vive Pro Eye, Varjo Aero).
Using an empirical sample of 24 participants, we compare the headsets' spatial accuracy, precision, and latency (relative to concurrent electrooculography) under conditions with and without head movements.
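The spatial accuracy and precision measures referred to above can be illustrated with standard angular data-quality definitions: accuracy as the mean angular offset of gaze samples from a fixation target, and precision as the root-mean-square (RMS) of successive sample-to-sample angular distances. The sketch below is an illustrative computation of these conventional metrics in plain Python, not EDIA's actual implementation; the function names and the 3D gaze-direction representation are assumptions.

```python
import math

def angle_between(v1, v2):
    """Angular distance (degrees) between two 3D gaze direction vectors."""
    dot = sum(a * b for a, b in zip(v1, v2))
    n1 = math.sqrt(sum(a * a for a in v1))
    n2 = math.sqrt(sum(b * b for b in v2))
    # Clamp to [-1, 1] to guard against floating-point overshoot in acos.
    cos_theta = max(-1.0, min(1.0, dot / (n1 * n2)))
    return math.degrees(math.acos(cos_theta))

def spatial_accuracy(gaze_dirs, target_dir):
    """Mean angular offset (deg) of gaze samples from the fixation target."""
    offsets = [angle_between(g, target_dir) for g in gaze_dirs]
    return sum(offsets) / len(offsets)

def spatial_precision_rms(gaze_dirs):
    """RMS (deg) of angular distances between successive gaze samples."""
    steps = [angle_between(a, b) for a, b in zip(gaze_dirs, gaze_dirs[1:])]
    return math.sqrt(sum(s * s for s in steps) / len(steps))
```

Computed per fixation target and per headset, such metrics allow the kind of within-session monitoring and between-device comparison the validation module is described as supporting.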

Acknowledgements: This research was supported by the cooperation project between the Max Planck Society and the Fraunhofer Gesellschaft (grant: project NEUROHUM).