Abstract

Distance travelled is a crucial metric that underpins an animal’s ability to navigate over short ranges. While there is extensive research on how terrestrial animals measure travel distance, it is unknown how animals navigating in aquatic environments estimate this metric. A common method used by land animals is to measure optic flow, where the speed of self-induced visual motion is integrated over the course of a journey. Whether freely swimming aquatic animals also measure distance relative to a visual frame of reference is unclear. Using the marine fish Rhinecanthus aculeatus, we show that teleost fish can use visual motion information to estimate distance travelled. However, the underlying mechanism differs fundamentally from that of previously studied terrestrial animals. Humans and terrestrial invertebrates measure the total angular motion of visual features for odometry, a mechanism that does not vary with visual density. In contrast, the visual odometer used by Rhinecanthus aculeatus is strongly dependent on the visual density of the environment. Odometry in fish may therefore be mediated by a movement detection mechanism akin to the system underlying the optomotor response, a separate motion-detection mechanism used by both vertebrates and invertebrates for course and gaze stabilisation.

A freely swimming teleost fish reproduces a previously learned distance via visual motion information, demonstrating that its visual odometry depends on the visual density of the environment.

Details

Title
Visual odometry of Rhinecanthus aculeatus depends on the visual density of the environment
Author
Karlsson, Cecilia 1; Willis, Jay 1; Patel, Matishalin 1; de Perera, Theresa Burt 1

1 Research and Administration Building, Department of Zoology, Oxford, UK
Publication year
2022
Publication date
2022
Publisher
Nature Publishing Group
e-ISSN
2399-3642
Source type
Scholarly Journal
Language of publication
English
ProQuest document ID
2719948826
Copyright
© The Author(s) 2022. This work is published under http://creativecommons.org/licenses/by/4.0/ (the “License”). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License.