Investigating visual weighting during postural control using virtual reality
Poster Presentation: Saturday, May 17, 2025, 8:30 am – 12:30 pm, Pavilion
Session: Multisensory Processing: Visual-haptic and visual-vestibular integration
Yingying Bei1, Jeffrey Allen Saunders1; 1The University of Hong Kong
Multiple sensory cues, including visual, vestibular, and proprioceptive signals, are available for maintaining balance when standing. We investigated the contribution of visual cues to postural control by measuring physical responses to continuous pseudorandom perturbations in virtual reality (VR). We varied the spectral composition of the perturbations to manipulate visual uncertainty. Optimal integration predicts reduced reliance on vision with higher uncertainty. On each trial, participants maintained a standing posture for 4.5 minutes in a virtual room that continuously rotated in the roll direction. In the baseline condition, perturbations were superimposed waves with low frequencies: 0.085 Hz, 0.115 Hz, and 0.155 Hz. In the added-oscillation conditions, perturbations included an additional wave with a frequency of 0.35 Hz or 0.70 Hz. Spectral analysis of head movements in the baseline condition revealed detectable responses at the perturbation frequencies. In the added-oscillation conditions, there was little or no response at the higher frequencies, but the response at the low frequencies was reduced relative to baseline. This suggests reduced reliance on visual information when visual uncertainty was increased by the additional oscillations. None of the participants reported motion sickness in any condition. Our results demonstrate that a continuous psychophysics approach can measure visual contributions to postural control with short exposures that are not sickness-inducing, and that the presence of higher-frequency oscillations can reduce responses to lower-frequency oscillations.
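The perturbation-and-analysis pipeline described above can be sketched in code. The following is a minimal illustration, not the authors' implementation: it assumes the perturbation waves are sinusoids, and the frame rate (60 Hz), component amplitude (1.0), and analysis-window length (200 s, chosen so each perturbation frequency lands exactly on an FFT bin) are all hypothetical parameters not given in the abstract. Only the perturbation frequencies come from the abstract.

```python
import numpy as np

# Assumed parameters (not from the abstract):
fs = 60.0       # display frame rate, Hz
window = 200.0  # analysis window, s; f * window is an integer for every
                # frequency below, so each lands exactly on an FFT bin
t = np.arange(0, window, 1 / fs)

base_freqs = [0.085, 0.115, 0.155]  # baseline perturbation frequencies (Hz)
added_freq = 0.35                   # one added-oscillation condition (Hz)
amp = 1.0                           # assumed amplitude of each component

# Perturbation signal: superimposed sinusoids (sinusoidal form is assumed)
signal = sum(amp * np.sin(2 * np.pi * f * t) for f in base_freqs)
signal += amp * np.sin(2 * np.pi * added_freq * t)

# Amplitude spectrum, scaled so a unit-amplitude sine reads 1.0 at its bin
spec = np.abs(np.fft.rfft(signal)) * 2 / len(t)
freqs = np.fft.rfftfreq(len(t), 1 / fs)

def amplitude_at(f):
    """Amplitude at the FFT bin nearest frequency f (Hz)."""
    return spec[np.argmin(np.abs(freqs - f))]

for f in base_freqs + [added_freq]:
    print(f"{f:.3f} Hz: amplitude {amplitude_at(f):.3f}")
```

In the experiment, the same spectral readout would be applied to recorded head movements rather than to the stimulus itself, with the gain at each perturbation frequency indexing the visual contribution to postural responses.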