Functional brain network dynamics capture context- and modality-general fluctuations in sustained attention

Poster Presentation: Sunday, May 18, 2025, 8:30 am – 12:30 pm, Pavilion
Session: Attention: Neural mechanisms

Anna Corriveau1,2, Jin Ke1,2,3, Monica D. Rosenberg1,2,4; 1Department of Psychology, The University of Chicago, 2Institute for Mind and Biology, The University of Chicago, 3Department of Psychology, Yale University, 4Neuroscience Institute, The University of Chicago

Sustained attention is supported by distributed brain networks. However, while sustaining focus in the real world requires processing multimodal information, studies of the brain networks underlying this ability have almost exclusively examined visual attention. Here, we test whether common neural mechanisms underlie sustained attention across visual and auditory modalities, and across controlled and naturalistic tasks. In a two-session fMRI study, participants performed a continuous performance task in which streams of trial-unique sounds and images were presented simultaneously. They were instructed to attend to either the images or the sounds and press a button when the relevant item belonged to a frequent (90%) but not an infrequent (10%) category. To isolate brain networks tracking attention dynamics, we parcellated fMRI data into 400 cortical and 32 subcortical regions and calculated edge co-fluctuation time series, i.e., the product of z-scored activation time series, between all pairs of brain regions. General linear models were fit to identify pairs of brain regions whose co-fluctuations predicted lapses in sustained attention. Analyses revealed edges whose dynamics were related to visual (5465 positive, 6678 negative edges) and auditory (1849 positive, 2401 negative edges) attention lapses, as well as edges common to both types of errors (~6.4% of selected edges; overlap significant at p<.001). We then tested whether fluctuations in the strength of these edges predicted subjective fluctuations in attention to narratives. We correlated edge co-fluctuation strength with continuously reported narrative engagement during four naturalistic stimuli: two audiovisual movies, one silent movie, and one podcast. While predictions from edges related to visual attention lapses were unreliable, the subset of edges involved in both visual and auditory attention lapses predicted changes in engagement in all narratives (Pearson's rs=.014-.055, all ps<.08). Results suggest that sustained attention relies on modality-general networks that capture attentional fluctuations across contexts.
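A minimal sketch of the edge co-fluctuation computation described in the abstract, assuming a (time × regions) BOLD matrix; the function name and array shapes are illustrative, not the authors' code. Each edge's time series is the element-wise product of the two regions' z-scored activity, so averaging it over time recovers the familiar Pearson functional connectivity.

```python
import numpy as np

def edge_time_series(bold):
    """Edge co-fluctuation time series (Esfahlani et al.-style edge dynamics).

    bold : (T, N) array of regional BOLD time series
           (here, N = 432 regions: 400 cortical + 32 subcortical).
    Returns a (T, N*(N-1)//2) array: at each time point, the product
    of z-scored activity for every unique pair of regions.
    """
    # z-score each region's time series (sample std, ddof=1)
    z = (bold - bold.mean(axis=0)) / bold.std(axis=0, ddof=1)
    # indices of the upper triangle = all unique region pairs (edges)
    i, j = np.triu_indices(bold.shape[1], k=1)
    # co-fluctuation: element-wise product of the two z-scored series
    return z[:, i] * z[:, j]

# Toy example: 10 time points, 4 regions -> 6 edges
rng = np.random.default_rng(0)
bold = rng.standard_normal((10, 4))
ets = edge_time_series(bold)
```

Summing an edge's co-fluctuation series and dividing by T-1 gives exactly the Pearson correlation between the two regions, which is why these edge dynamics can be read as a temporal "unwrapping" of static functional connectivity.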

Acknowledgements: National Science Foundation BCS-2043740 (M.D.R.)