Investigating Non-Rigid Motion Perception with Multisensory Inputs: A Pilot Study

Poster Presentation: Sunday, May 18, 2025, 2:45 – 6:45 pm, Pavilion
Session: Motion: Local, higher-order, in-depth

Danica Barron, Erin Conway, Troy Smith, Ralph Hale; University of North Georgia

Koerfer et al. (2024) found that non-rigid motion generates a motion signal that selective visual systems can use for perception and saccadic tracking, but not for prediction or smooth pursuit. Our pilot study builds on this work to explore how multisensory inputs affect the perceptual stability of non-rigid motion. Using their novel, dynamic vortex stimulus, we examined whether auditory feedback influences visual perception and pursuit, as it does with rigid motion. Undergraduate students completed two counterbalanced blocks: a visual-only condition and a visual-audio condition. The vortex was composed of 8000 white dots on a black background, displayed at a visual angle of 97.5° (width) × 55.9° (height). It rotated at 45°/s, driven by an underlying mathematical model that simulated fluid-like motion. The stimulus was designed without distinct predictive cues, such as a clear leading edge, consistent directional markers, or static reference points. In the visual-audio condition, auditory feedback was delivered through noise-canceling headphones as a panning, low-pitched tone synchronized to the vortex's motion and intended to add spatial depth cues. Preliminary results indicate that auditory feedback may affect perceptual stability under complex motion conditions, though smooth pursuit mechanisms remained unengaged. This pilot study demonstrates the feasibility of the paradigm and highlights areas for refinement before subsequent studies. These findings contribute to a broader understanding of how the brain processes multisensory inputs in complex visual environments.
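The vortex described above could be sketched as a dot field whose angular speed falls off with distance from center, so inner and outer dots shear past one another. This is a minimal illustrative sketch only: the abstract gives the dot count, display extent, and nominal rotation speed, but not the actual mathematical model from Koerfer et al. (2024), so the radius-dependent falloff here is an assumed stand-in for fluid-like, non-rigid flow.

```python
import numpy as np

# Parameters taken from the abstract; the falloff constant is assumed.
N_DOTS = 8000                   # white dots on a black background
OMEGA_DEG = 45.0                # nominal rotation speed, deg/s
FIELD_W, FIELD_H = 97.5, 55.9   # display extent, degrees of visual angle

rng = np.random.default_rng(0)

def init_dots(n=N_DOTS):
    """Scatter dots uniformly over the display area, centered at (0, 0)."""
    x = rng.uniform(-FIELD_W / 2, FIELD_W / 2, n)
    y = rng.uniform(-FIELD_H / 2, FIELD_H / 2, n)
    return np.stack([x, y], axis=1)

def step(dots, dt):
    """Advance one frame: angular speed decreases with radius, so the field
    deforms (non-rigid) rather than rotating as a solid disk."""
    r = np.hypot(dots[:, 0], dots[:, 1])
    theta = np.arctan2(dots[:, 1], dots[:, 0])
    omega = np.deg2rad(OMEGA_DEG) / (1.0 + r / 10.0)  # assumed falloff
    theta = theta + omega * dt
    return np.stack([r * np.cos(theta), r * np.sin(theta)], axis=1)

dots = init_dots()
next_frame = step(dots, dt=1 / 60)
```

Because each dot moves along a circle about the center, the update preserves every dot's radius while changing its angle at a radius-dependent rate, which is what produces the shearing, vortex-like appearance without any rigid leading edge.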
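The panning auditory cue could likewise be sketched as a stereo tone whose left/right balance tracks the vortex phase. This is a hypothetical sketch: the abstract specifies only a "panning, low-pitched tone synchronized to the vortex's motion," so the tone frequency, sample rate, and equal-power panning law below are assumptions.

```python
import numpy as np

SR = 44100      # assumed sample rate, Hz
FREQ = 220.0    # assumed "low-pitched" tone frequency, Hz

def panned_tone(duration_s, omega_deg=45.0, sr=SR, freq=FREQ):
    """Stereo sine tone whose pan follows the vortex's rotational phase.

    Uses an equal-power pan law so perceived loudness stays constant as
    the tone sweeps between the left and right channels.
    """
    t = np.arange(int(duration_s * sr)) / sr
    tone = np.sin(2 * np.pi * freq * t)
    angle = np.deg2rad(omega_deg) * t        # vortex phase over time
    pan = (np.sin(angle) + 1) / 2            # 0 = full left, 1 = full right
    left = tone * np.sqrt(1 - pan)
    right = tone * np.sqrt(pan)
    return np.stack([left, right], axis=1)   # (samples, 2) stereo buffer

audio = panned_tone(0.1)
```

The equal-power law keeps `left**2 + right**2` equal to the mono tone's power at every sample, so the pan moves the tone spatially without loudness fluctuations that could themselves act as a cue.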