Action boosts vision, especially in tough tasks

Poster Presentation: Sunday, May 18, 2025, 8:30 am – 12:30 pm, Pavilion
Session: Action: Perception and recognition

Gizem Y. Yildiz1, Elena Witzel1,2, Katja Fiehler2, Bianca M. van Kemenade1; 1Center for Psychiatry, Justus Liebig University, Giessen, Germany, 2Experimental Psychology, Justus Liebig University, Giessen, Germany

Every action we perform triggers the brain to generate predictions using copies of motor commands, known as efference copies. These action-predictions are then compared with incoming sensory information, shaping our perception. Action-predictions usually lead to sensory attenuation. However, in the auditory domain, it has been shown that while attenuation occurs for clearly audible action outcomes, the effect reverses to enhancement for near-threshold stimuli. It remains unclear whether similar effects occur in the visual domain. To investigate whether action-predictions influence visual perception differently depending on stimulus visibility, we asked participants to perform an orientation discrimination task on tilted gratings presented at supra-threshold (high-visibility) and sub-threshold (low-visibility) levels under active and passive viewing conditions. In the active viewing conditions, participants pressed a button indicated by an auditory cue to trigger the presentation of the gratings. In the passive viewing conditions, the gratings were presented automatically following the auditory cue. Experiment 1 demonstrated that both the discrimination thresholds and the slopes were affected by the visibility of the grating: thresholds were lower, and slopes were higher, in high-visibility trials compared to low-visibility trials. Although no overall differences were observed between the active and passive viewing conditions, post-hoc analysis indicated a trend toward an accuracy difference between the two conditions, specifically in challenging comparison trials within high-visibility blocks. In Experiment 2, we therefore manipulated task difficulty to create blocks of challenging and easy comparisons within high- and low-visibility conditions under active and passive viewing. The results showed that participants were significantly more accurate in the active trials than in the passive trials of high-visibility blocks, especially when the task was more difficult. We conclude that action-predictions enhance perception primarily in situations where visual stimuli are less noisy but the task is more challenging.

Acknowledgements: Deutsche Forschungsgemeinschaft (DFG, German Research Foundation)–SFB/TRR 135, Project A10