Abstract

Sustained attention is essential for daily life and can be directed to information from different perceptual modalities, including audition and vision. Recently, cognitive neuroscience has aimed to identify neural predictors of behavior that generalize across datasets. Prior work has shown strong generalization of models trained to predict individual differences in sustained attention performance from patterns of fMRI functional connectivity. However, it is an open question whether predictions of sustained attention are specific to the perceptual modality in which they are trained. In the current study, we test whether connectome-based models predict performance on attention tasks performed in different modalities. We show first that a predefined network trained to predict adults’ visual sustained attention performance generalizes to predict auditory sustained attention performance in three independent datasets (N1 = 29, N2 = 60, N3 = 17). Next, we train new network models to predict performance on visual and auditory attention tasks separately. We find that functional networks are largely modality general, with both model-unique and shared model features predicting sustained attention performance in independent datasets regardless of task modality. Results support the supposition that visual and auditory sustained attention rely on shared neural mechanisms and demonstrate robust generalizability of whole-brain functional network models of sustained attention.

Author Summary: While previous work has demonstrated the external validity of functional connectivity-based networks for predicting cognitive and attentional performance, tests of generalization across visual and auditory perceptual modalities have been limited. The current study demonstrates robust prediction of sustained attention performance regardless of the perceptual modality in which models are trained or tested. Results demonstrate that connectivity-based models may generalize broadly, capturing variance in sustained attention performance that is agnostic to the perceptual modality of model training.

Details

Title
Functional brain networks predicting sustained attention are not specific to perceptual modality
Publication title
Volume
9
Issue
1
Pages
303-325
Publication year
2025
Publication date
2025
Section
Research
Publisher
MIT Press Journals, The
Place of publication
Cambridge
Country of publication
United States
e-ISSN
2472-1751
Source type
Scholarly Journal
Language of publication
English
Document type
Journal Article
Publication history
Milestone dates
2024-07-12 (Received); 2024-11-17 (Accepted); 2025-03-20 (Publication Date)
ProQuest document ID
3185249526
Document URL
https://www.proquest.com/scholarly-journals/functional-brain-networks-predicting-sustained/docview/3185249526/se-2?accountid=208611
Copyright
Copyright MIT Press Journals, The 2025
Last updated
2025-11-24
Database
ProQuest One Academic