Spatially uninformative sounds modulate midbrain visual activity with and without primary visual cortical input
Poster Presentation: Sunday, May 18, 2025, 2:45 – 6:45 pm, Pavilion
Session: Multisensory Processing: Perception, neural, clinical
Tatiana Malevich1,2, Matthias P. Baumann1,2, Yue Yu1,2, Tong Zhang1,2, Ziad M. Hafed1,2; 1Werner Reichardt Centre for Integrative Neuroscience, 2Hertie Institute for Clinical Brain Research, 3University of Tuebingen
We recently discovered that spatially uninformative sounds can activate otherwise dormant visual-motor pathways that bypass the primary visual cortex (V1). Here, we aimed to better understand how this might happen. We recorded from superior colliculus (SC) neurons (two monkeys) with either intact or focally inactivated V1 (muscimol microinjection; 1.5-2.5 μL; 10 mg/mL). We presented a 0.2 deg radius disc within the neurons' receptive fields (RFs) and randomly interleaved trials in which we paired the visual stimulus onset with a bilateral sound pulse (50 ms; 1 kHz). This sound pulse was neither informative about the visual stimulus location nor spatially aligned with the RF locations. With intact V1, SC neurons showed little, if any, response to the sound alone. Nonetheless, visual response strength and latency were diversely affected: responses were sometimes stronger and earlier on vision+sound trials, other times weaker and later, and yet other times unaffected. Such multisensory integration was also evident in local field potentials (LFPs), with evoked responses to multisensory stimuli being enhanced and distinct from those elicited by unimodal stimuli of either modality. With inactivated V1, SC visual responses were much sparser, at both the single-unit and LFP levels. However, adding spatially uninformative sounds unmasked a relatively weak visually evoked LFP response that was not explained by sound-only responses, and a few single units exhibited clear multisensory integration in their spiking. Next, we sampled some inferior colliculus (IC) neurons. With intact V1, when IC neurons exhibited visual responses, they showed clear multisensory integration, even without having sound-only responses; however, LFP visually evoked responses were predominantly driven by sound. With inactivated V1, IC single-unit and LFP responses were abolished.
These results underscore the distinct roles of the SC and IC in multisensory integration, and they support a potential SC involvement in visually guided behavior when V1 is compromised.