Abstract

Self-motion along linear paths without eye movements creates optic flow that radiates from the direction of travel (heading). Optic flow-sensitive neurons in primate brain area MSTd have been linked to linear heading perception, but the neural basis of more general curvilinear self-motion perception is unknown. The optic flow in this case is more complex and depends on the gaze direction and the curvature of the path. We investigated the extent to which signals decoded from a neural model of MSTd predict the observer’s curvilinear self-motion. Specifically, we considered the contributions of MSTd-like units that were tuned to radial, spiral, and concentric optic flow patterns in “spiral space”. Self-motion estimates decoded from units tuned to the full set of spiral space patterns were substantially more accurate and precise than those decoded from units tuned to radial expansion. Decoding only from units tuned to spiral subtypes closely approximated the performance of the full model. Only the full decoding model could account for human judgments when path curvature and gaze covaried in self-motion stimuli. The most predictive units exhibited bias in center-of-motion tuning toward the periphery, consistent with neurophysiology and prior modeling. Together, these findings support a distributed encoding of curvilinear self-motion across spiral space.
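To make the notion of “spiral space” concrete, the following is a minimal illustrative sketch (not the authors’ code) of how such optic flow templates are commonly parameterized: each template has a center of motion and a spiral angle, where 0° gives radial expansion, 90° gives concentric rotation, and intermediate angles give spirals. The function name and grid parameters are hypothetical.

```python
# Minimal sketch, assuming the standard spiral-space parameterization:
# flow direction at each point is the radial direction from the center of
# motion rotated by a fixed spiral angle (0 deg = expansion, 90 deg = rotation).
import numpy as np

def spiral_template(grid_size=64, center=(0.0, 0.0), spiral_angle_deg=0.0):
    """Return unit flow vectors (u, v) on a grid for one spiral-space pattern."""
    xs = np.linspace(-1.0, 1.0, grid_size)
    x, y = np.meshgrid(xs, xs)
    dx, dy = x - center[0], y - center[1]      # offsets from the center of motion
    r = np.hypot(dx, dy) + 1e-9                # avoid division by zero at the center
    radial = np.stack([dx / r, dy / r])        # outward (expansion) component
    tangential = np.stack([-dy / r, dx / r])   # rotational (concentric) component
    theta = np.deg2rad(spiral_angle_deg)
    u, v = np.cos(theta) * radial + np.sin(theta) * tangential
    return u, v

# Example: radial expansion, a 45-degree spiral, and concentric rotation.
templates = {angle: spiral_template(spiral_angle_deg=angle) for angle in (0.0, 45.0, 90.0)}
```

In the kind of decoding analysis the abstract describes, model units tuned to such templates (spanning centers of motion and spiral angles) would supply the population signals from which curvilinear self-motion is read out; the sketch above only illustrates the stimulus-pattern family, not the model or decoder itself.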

Details

Title
Distributed encoding of curvilinear self-motion across spiral optic flow patterns
Author
Layton, Oliver W. 1 ; Fajen, Brett R. 2 

1  Colby College, Department of Computer Science, Waterville, USA (GRID:grid.254333.0) (ISNI:0000 0001 2296 8213); Rensselaer Polytechnic Institute, Department of Cognitive Science, Troy, USA (GRID:grid.33647.35) (ISNI:0000 0001 2160 9198)
2  Rensselaer Polytechnic Institute, Department of Cognitive Science, Troy, USA (GRID:grid.33647.35) (ISNI:0000 0001 2160 9198)
Publication year
2022
Publication date
2022
Publisher
Nature Publishing Group
e-ISSN
2045-2322
Source type
Scholarly Journal
Language of publication
English
ProQuest document ID
2698377658
Copyright
© The Author(s) 2022. This work is published under http://creativecommons.org/licenses/by/4.0/ (the “License”). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License.