The interplay between attention and audio-visual object storage in working memory
Poster Presentation: Saturday, May 17, 2025, 2:45 – 6:45 pm, Pavilion
Session: Multisensory Processing: Audiovisual integration
Ceren Arslan1, Daniel Schneider1, Stephan Getzmann1, Edmund Wascher1, Laura-Isabelle Klatt1; 1Leibniz Research Centre for Working Environment and Human Factors, Dortmund, Germany
To navigate a multisensory world, the brain must integrate and maintain information from multiple senses. However, current working memory (WM) research focuses primarily on the visual domain, so our knowledge of how multisensory information is encoded and maintained remains incomplete. Here, we conducted two EEG experiments using an audio-visual delayed-match-to-sample task to elucidate how audio-visual objects are stored in WM. In Experiment 1, participants were presented with audio-visual objects while attending to auditory features (attend-auditory trials), visual features (attend-visual trials), or both (conjunction trials). Behavioral results showed that task-irrelevant features interfered at recall, suggesting that they were encoded into WM in a bottom-up fashion. However, traditional ERP measures of unisensory WM load indicated that these task-irrelevant features were not actively maintained. When the task required deliberate feature integration (conjunction trials), attentional demands increased, as indicated by stronger alpha power suppression at recall. In Experiment 2, we added unisensory controls (auditory-only and visual-only trials) and manipulated the spatial arrangement of the tones and orientations to be either compatible or disparate. Performance declined in the attend-visual condition when features were spatially disparate, whereas the attend-auditory and conjunction conditions were unaffected by this spatial manipulation. Representational similarity analysis indicated that task-irrelevant tones (in attend-visual trials) were reactivated at recall, whereas task-irrelevant orientations (in attend-auditory trials) were more consistently filtered out. In contrast to the behavioral effects, this pattern was unaffected by the spatial compatibility of the auditory and visual features. In sum, these findings highlight the complex dynamics of selective attention in modulating multisensory feature integration and audio-visual object storage in working memory.