Introduction
Multisensory integration should occur when two stimuli share a plausible common causation. Visual motion is ambiguous in that it can represent external object motion or self-motion through a fixed environment. For visual and inertial cues to be integrated, they must have self-motion as the common causation. Perception of heading direction is an ecologically relevant sensory-based ability in which visual and inertial cues are normally integrated [1–4]. How common causation is determined has been shown to depend on both spatial [5] and temporal alignment [6–10]. However, results to date have suggested that visual and inertial stimuli do not need consistent velocity and acceleration profiles [11]. The current study challenges that assumption by examining visual and inertial stimuli with very different motion profiles.
Close temporal alignment of the stimuli is important: closer timing yields more robust multisensory integration [6–10]. This has been shown by our group: when visual and inertial stimuli are more than about 250 ms out of temporal alignment, minimal sensory integration occurs, and when they are more than 500 ms out of temporal alignment, they are not integrated at all [12]. This occurs despite significant temporal overlap between the visual and inertial stimuli at these offsets, suggesting that visual motion must not only occur during the inertial motion but must also have plausible common causation with self-motion based on the inertial stimulus. In these previous experiments, the peak velocity and acceleration were not modified independently of the beginning and end of the stimuli, so it is unclear whether the most relevant factor was differences in onset time, end time, time of peak acceleration/velocity, or another factor.
When visual motion is consistent with self-motion, it is more likely to be interpreted as such. This has been studied in the context of visual stimuli presented alone, where a more compelling sense of vection (i.e., the illusory perception of self-motion induced by a visual stimulus while the observer is stationary) occurs when there is an acceleration component rather than just a constant velocity [13]. This is presumably because acceleration is more consistent with inertial motion. Furthermore, that study demonstrated that the sensation of vection becomes most compelling after the acceleration component subsides. Because the subjects were stationary, no inertial acceleration was present, consistent with the vection sensation depending on its consistency with inertial motion. In a follow-up study further supporting the integration of visual acceleration, standing individuals reported consistently stronger vection when acceleration was superimposed on constant radial expansion or contraction [14].
Some degree of temporal alignment between visual and inertial motion should be required for visual-inertial multisensory integration [15], but which specific features of the stimuli need to be aligned remains unclear. It seems likely that, in addition to the start and end points of the visual and inertial motion being similar, the alignment of the velocity and/or acceleration profiles could also be important. In a previous study in which subjects were asked to judge whether a stimulus was straight ahead or offset, some subjects did not integrate the visual stimulus because it was thought to be directionally inconsistent with the inertial stimulus [16]. However, a subsequent study that varied the visual motion profile demonstrated that visual-inertial integration occurred equally well, and in a statistically optimal manner, whether the velocity profile of the visual stimulus was constant or matched the inertial stimulus [11]. The difference between these results may be due to the much longer stimulus duration (around 10 s) used in the earlier study. A subsequent study examined the effect of stimulus duration and found that longer stimuli weighted the visual component of heading more heavily [17]. These longer duration stimuli have lower acceleration (closer to the perceptual threshold); thus, the inertial component likely had lower reliability, which could explain why the visual component had a greater influence on heading perception.
For the current study, we focused on 2 s duration visual and inertial stimuli. This duration was chosen because it fits within what can be comfortably and reliably delivered in our laboratory while avoiding potentially confounding velocity storage from longer movements. It is also the same duration previously used to show that temporal alignment is required [12] and, somewhat paradoxically, that a constant velocity visual stimulus is integrated the same as a velocity-matched visual stimulus [11], although in that study the total duration was shorter (1 s). The Butler et al. study compared the fixed velocity visual stimulus with a velocity profile that matched the inertial stimulus; however, it did not vary the timing of the velocity and acceleration profile within the visual stimulus.
The current study tests the hypothesis that features within the visual stimulus need to be consistent with the inertial stimulus for multisensory integration to occur. This was done by comparing the visual influence on inertial heading perception for stimuli in which the motion profiles of the visual and inertial stimuli were matched and for stimuli in which the visual stimulus velocity and acceleration were inverted. Both visual stimuli correspond to the same distance covered in space. The purpose of this study was not to examine causal inference [18], as it would have been obvious to most observers that the inverted visual stimulus was inconsistent with the inertial motion. We previously observed that visual stimuli bias inertial direction perception even with large offsets of up to 90°, when it was clear to subjects that the stimuli were artificially offset [5]. Thus, the range of offsets used in the current study extended to 75°. We hypothesized three possible ways in which modification of the visual stimulus might influence multisensory integration: 1) It could narrow the range of angular deviation between visual and inertial stimuli over which integration occurs. This was tested by examining ±45, ±60, and ±75° offsets between inertial and visual headings. If this occurred, it would imply that both heading disparity and velocity profile are considered in determining visual-inertial common causation. 2) The influence of the visual stimulus could be reduced. If this were the case, the perceived heading direction would be closer to the inertial heading direction when the visual stimulus did not match. 3) Integration could occur independently of the velocity profile of the visual stimulus, suggesting that congruent path endpoints may be more relevant. In other words, variation in the visual stimulus profile would not affect inertial heading perception.
Methods
Ethics statement
The research was conducted according to the principles expressed in the Declaration of Helsinki. A written informed consent form was approved by the University of Rochester Research Subjects Review Board before the study was conducted. All subjects participated in the experiments between July and October of 2022.
Human subjects
Twenty healthy subjects (13 female) were enrolled in the experiments. The mean age was 20 ± 4 years (mean ± SD; range 18–31). The oldest subjects were #2 (31) and #20 (29). Eleven subjects were 18 years old. All subjects reported vision that was normal or corrected to normal.
Equipment
A 6-degree-of-freedom (6-DOF) motion platform (Moog, East Aurora, NY model 6DOF2000E) was used to deliver the inertial motion stimuli. The setup is common in human motion perception studies [19] and has been described previously for heading estimation experiments [4,12,20,21]. A 55” color LCD screen with a 1920 x 1080 pixel resolution delivered the visual component. The screen was mounted to the motion platform and set 50 cm from the subject, filling a 117° horizontal field-of-view. The subjects wore a helmet attached to the motion platform while they sat in an automotive-style racing seat. In this study, the head movement was not measured separately from the platform as significant decoupling was felt to be unlikely for the type of motion used. As detailed previously [22], an audible white noise was reproduced from two platform-mounted speakers on either side of the subject to mask sound from the platform.
Stimuli
The visual component was formed by a star field in which each star was a 0.5 cm tall yellow triangle whose scale was adjusted at the plane of the screen to simulate distance, as previously described [12]. A visual refresh rate of 60 Hz was used, with 30% of points randomly repositioned in each frame (i.e., 70% visual coherence). This decreased coherence was used to encourage subjects to focus on the inertial stimulus. No fixation point was provided; the lights were switched off, and subjects faced the screen displaying the visual stimulus.
Acceleration-matched visual and inertial stimuli were both presented with a motion profile consisting of a single 2 s (0.5 Hz) sine wave in acceleration. The movement corresponded to 15 cm of displacement, with a peak acceleration of 25 cm/s² and a peak velocity of 15 cm/s, more than an order of magnitude above human motion perception thresholds [23,24]. The inverted visual stimulus also lasted 2 s (Fig 1) and had a similar peak acceleration (Fig 1A), peak velocity (Fig 1B), and displacement (Fig 1C). The difference between the normal and inverted visual stimulus was that the inverted stimulus began and ended at the peak velocity while the normal stimulus reached peak velocity at the midpoint (Fig 1B). However, both stimuli corresponded to the same displacement and had the same peak velocity and peak acceleration.
[Figure omitted. See PDF.]
Stimuli were also included in which the velocity profile of the visual stimulus was inverted such that it started and ended at the peak velocity (15 cm/s) while slowing to zero in the middle (inverted, dashed line). Thus, the inverted stimulus was inconsistent with the inertial motion experienced.
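As a check on the stated stimulus parameters, the two motion profiles can be reconstructed numerically. The following is a sketch, not the authors' stimulus code; the sampling grid and the construction of the inverted profile from the normal one are our assumptions based on the description above.

```python
import numpy as np

# Single-cycle sine acceleration profile, a(t) = A*sin(2*pi*f*t), with
# f = 0.5 Hz over T = 2 s, as described for the "normal" stimulus.
# A is the peak acceleration quoted in the text.
T, f, A = 2.0, 0.5, 25.0                  # s, Hz, cm/s^2
t = np.linspace(0.0, T, 20001)
dt = t[1] - t[0]
a = A * np.sin(2 * np.pi * f * t)
v = np.cumsum(a) * dt                     # velocity by numerical integration
x = np.cumsum(v) * dt                     # displacement

print(round(v.max(), 1))                  # peak velocity = 2A/pi ~ 15.9 cm/s
print(round(x[-1], 1))                    # total displacement ~ 15.9 cm

# The "inverted" profile starts and ends at peak velocity and dips to zero
# at mid-stimulus; it covers the same distance with the same peak values.
v_inv = v.max() - v
x_inv = np.cumsum(v_inv) * dt
print(round(x_inv[-1], 1))                # same displacement as the normal profile
```

With A = 25 cm/s², the peak velocity and total displacement both work out to 2A/π ≈ 15.9, consistent (to rounding) with the 15 cm/s and 15 cm quoted in the text, and the inverted profile covers the same distance as the normal one.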
Each trial block consisted of twelve randomly interleaved staircases. In half of these staircases, the inertial stimulus started 50° to the right, and in the other half it started 50° to the left; both could pass through zero into the opposite direction. At 50° offsets, all subjects were reliably able to identify the inertial heading direction as left or right, independent of the visual offset. The six staircases in each heading direction were divided into three with an inverted visual stimulus and three with a normal visual stimulus. Within each set of three, there was one with the visual stimulus offset to the right, one aligned with the inertial stimulus (no offset), and one offset to the left. Each staircase included 15 stimulus presentations. Offsets of ±45, ±60, and ±75° were tested in different trial blocks in an order that was randomly determined for each subject. Masking noise was played during each stimulus presentation. Subjects were asked to report the direction of the inertial stimulus as left or right of straight ahead. For staircases that started with a stimulus 50° to the left, after each leftward response the next stimulus was shifted 8° to the right. After a rightward response, the step size was halved (e.g., from 8° to 4°) and the stimulus shifted to the left. With subsequent reversals, step sizes could be reduced to a minimum of 1°, or increased after three consecutive responses in the same direction. The corresponding staircases that started to the right were adjusted similarly, and each staircase could step through zero. As a result, the majority of stimuli late in each staircase were concentrated near the point of subjective equality (PSE), at which subjects were nearly equally likely to respond left or right. If no direction was entered within 2 s, no response was recorded, and the stimulus was presented again the next time that staircase was active. Such lapses were rare, occurring less than 1% of the time.
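The staircase rule above can be sketched as follows. This is a hypothetical reconstruction, not the authors' code: the function name, the ideal-observer stand-in for a subject, and the exact rule for increasing the step (doubling, capped at the starting 8°) are our assumptions; the paper states only that steps halved on reversal (minimum 1°) and could increase after three same-direction responses.

```python
# Hypothetical sketch of one adaptive staircase as described in the text.
def run_staircase(pse, start=-50.0, step=8.0, n_trials=15):
    heading, last, run = start, None, 0
    headings = []
    for _ in range(n_trials):
        headings.append(heading)
        # Ideal-observer stand-in: report the side of the (true) PSE.
        resp = 'right' if heading > pse else 'left'
        if last == resp:
            run += 1
            if run >= 3:                       # three in a row: step may grow
                step = min(step * 2, 8.0)      # assumed increase rule (capped)
                run = 0
        else:
            if last is not None:               # reversal: halve, floor at 1 deg
                step = max(step / 2, 1.0)
            run = 1
        heading += step if resp == 'left' else -step   # step toward the PSE
        last = resp
    return headings

print(run_staircase(pse=5.0)[-3:])   # late trials cluster near the PSE
```

Running this with an assumed PSE of 5° shows the qualitative behavior described in the text: early trials march in 8° steps from the 50° starting offset, and late trials oscillate within a degree or two of the PSE.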
Analysis
The fraction of rightward responses for each stimulus level was plotted as a function of the heading direction tested for both the normal velocity (Fig 2A-2C) and inverted velocity (Fig 2D-2F) stimuli. The PSE was determined by fitting a normal (Gaussian) cumulative distribution function, using the fit function in Matlab (version R2019b), to data collected from otherwise similar staircases that started in opposite directions (Fig 2). This determined the mean of the psychometric function (also the PSE), at which responses were equally likely to be reported in each direction. The slope of the psychometric function (i.e., sigma or standard deviation) was also determined. In the current study, sigma denotes the slope of the psychometric function in an individual, while standard deviation describes the variation around the mean in a population. In the example shown, offsetting the normal visual stimulus 60° to the left (Fig 2A) tended to bias the perception of inertial direction such that an inertial stimulus had to be shifted 10.3° to the right to be perceived as neutral (i.e., straight ahead). The opposite bias was seen when the visual stimulus was shifted 60° to the right (Fig 2C). Visual offsets produced smaller biases with the inverted velocity stimulus (Fig 2D and 2F). In both cases, the zero offsets produced very small biases (Fig 2B and 2E).
[Figure omitted. See PDF.]
± 60° offset. With normal velocity visual stimuli (panels A-C), the visual stimulus influenced the perceived inertial heading. Thus, for a visual stimulus shifted to the left, a straight-ahead inertial heading (0°) is more likely to be perceived as left, which corresponds to the cumulative distribution function being shifted to the right (panel A). For this subject, the inverted stimulus (panels D-F) had a minimal effect on inertial heading perception. Each small circle represents an individual stimulus presentation; larger circles represent multiple stimulus presentations in proportion to their diameter.
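The cumulative Gaussian fit described above can be sketched in Python as a stand-in for the Matlab fit call. The data points below are illustrative values we made up, not data from the study.

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

# Cumulative Gaussian psychometric function: mu is the PSE (the heading at
# which "left" and "right" responses are equally likely); sigma is the slope.
def psychometric(x, mu, sigma):
    return norm.cdf(x, loc=mu, scale=sigma)

# Illustrative data only: headings (deg) and fraction of rightward responses.
headings = np.array([-16, -8, -4, 0, 4, 8, 16], dtype=float)
p_right  = np.array([0.0, 0.1, 0.2, 0.5, 0.8, 0.9, 1.0])

(mu, sigma), _ = curve_fit(psychometric, headings, p_right, p0=[0.0, 5.0])
print(round(mu, 2), round(sigma, 2))  # PSE near 0 deg for this symmetric example
```

A bias induced by an offset visual stimulus, as in panel A, would appear as a nonzero fitted mu; a shallower psychometric function (less reliable responding) would appear as a larger sigma.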
Statistics were performed with JMP for the Macintosh (version 18.2.0). A two-way analysis of variance (ANOVA) was done using the three non-zero offsets (45, 60, and 75°) and the two visual stimulus types (normal and inverted). A one-way ANOVA followed by post-hoc analysis was performed to analyze heading offsets and effects among participants. A Student's paired t-test with alpha set at 0.05 was used to compare the two types of visual stimulus profile.
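As an illustration of the paired design, each subject contributes one mean bias per visual stimulus type, and the test is applied to the within-subject differences. The numbers below are made up for illustration (loosely patterned on the group means reported later), not the study's data.

```python
import numpy as np
from scipy import stats

# Hypothetical sketch of the paired comparison: 20 subjects, one mean bias
# (deg) per visual stimulus type. Values are simulated, illustrative only.
rng = np.random.default_rng(0)
bias_normal   = rng.normal(9.4, 5.0, size=20)
bias_inverted = bias_normal - rng.normal(3.4, 2.0, size=20)

t, p = stats.ttest_rel(bias_normal, bias_inverted)
print(p < 0.05)   # the paired design tests the within-subject difference
```

Pairing by subject removes the large between-subject variability (which the paper reports as highly significant) from the comparison of stimulus types.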
Results
Heading perception was biased in the direction of the offset visual stimulus, and the mean (bias) and sigma were determined from individual responses (Fig 2). When the aggregate data were examined, the inverted visual stimulus biased inertial perception less than the velocity-matched stimulus (Fig 3). When only the stimuli in which the visual stimulus was offset were considered (i.e., aligned stimuli were excluded), the mean bias towards the visual stimulus was 6.0 ± 1.0° (mean ± SE) with the inverted visual stimulus and 9.4 ± 1.2° (mean ± SE) with the normal visual stimulus. A two-way ANOVA was performed using visual stimulus type (normal, inverted) and the three non-zero offsets (45, 60, 75°). This revealed that visual stimulus type was the only statistically significant factor (F(1,38) = 9.61, p = 0.0022), with no effect of offset magnitude (F(2,38) = 0.28, p = 0.76). There was a significant effect between subjects (ANOVA, F(6,19) = 13.2, p < 0.0001). A paired t-test of inverted vs. normal visual stimulus was statistically significant, t(118) = 6.487, p < 0.0001 (two-tailed), when all the non-zero offsets were considered. When the offsets were considered separately, the effect remained statistically significant at 45° (t(39) = 3.04, p = 0.004), 60° (t(39) = 4.18, p = 0.0002), and 75° (t(39) = 4.58, p < 0.0001). Given that the size of the offset did not have a significant effect, the non-zero offsets were combined to show the individual data so that the effect of the visual stimulus could be seen for each of the twenty subjects (Fig 4).
[Figure omitted. See PDF.]
The data shown represent all twenty subjects. For the zero visual offset, the bias is plotted as positive towards the right. When the visual stimulus was offset, the bias is plotted towards the visual stimulus so that leftward and rightward biases could be shown together. Error bars represent ±1 SEM. P-values were calculated using a paired t-test and are shown for each offset.
[Figure omitted. See PDF.]
Only non-zero visual offsets are included. Bias is plotted towards the visual stimulus offset and represents both leftward and rightward biases. Error bars represent ±1 SEM, which was calculated across all visual offsets (±45, ± 60, and ±75°) collected in that subject.
For 17 of the 20 participants, the acceleration-matched visual stimulus biased perceived heading direction more than the inverted velocity stimulus (Fig 4). However, the amount of bias varied between subjects, as did the relative bias between the two stimulus types. This variation between subjects was statistically significant (ANOVA, F = 4.4, df = 19, p < 0.0001). One subject (#17) had an unusually large bias, while in two subjects (#1 and #8) the visual stimulus of either type had effectively no influence on inertial heading direction. Two subjects had a bias with the normal visual stimulus but not with the inverted one (#2, #20). In an additional four subjects, the bias induced by the normal stimulus was at least twice that of the inverted visual stimulus (#3, #5, #10, #18). In most of the remaining subjects, there was a modest bias towards the visual stimulus that was greater with the normal visual stimulus than with the inverted one.
The slope of the psychometric function (sigma) was also determined (Fig 5). When the potential effects of visual stimulus type and non-zero offset were examined using a two-way ANOVA, there were no statistically significant effects. The visual stimulus profile was not a significant factor (F(1,38) = 0.36, p = 0.55), nor was the offset magnitude (F(2,38) = 0.20, p = 0.82). As with the bias, there was significant variation in sigma between subjects (ANOVA, F(6,19) = 3.38, p < 0.0001). Overall, the normal stimulus had a sigma of 6.1 ± 0.4° (SEM) and the inverted stimulus a sigma of 5.6 ± 0.4° (SEM), but this difference was not statistically significant (t(179) = -0.94, p = 0.35). There was also no effect of the size of the visual offset (ANOVA, F = 0.16, df = 3, p = 0.92). Sigma was not correlated with the magnitude of bias towards the visual stimulus (R² = 0.009, p = 0.07).
[Figure omitted. See PDF.]
All visual offsets were included (including zero) as there was no significant difference based on offset size. Error bars represent ±1 SEM.
Discussion
It was previously reported that visual and inertial heading stimuli were optimally integrated even when the visual stimulus did not have a motion profile consistent with the inertial motion [11]. In that study, the visual stimulus was inconsistent with inertial motion because of its constant velocity, and beyond that, the type of inconsistency was not explored. This finding was somewhat surprising to the current authors, as other forms of inconsistency, such as large directional offsets between stimuli [5] and timing differences beyond about 200 ms [12], diminish visual-inertial multisensory integration. Furthermore, visual stimuli presented alone produce a more compelling sense of vection when they contain an acceleration component rather than just a fixed velocity [13], suggesting an accelerating visual stimulus is more likely to be perceived as self-motion. Unpublished data from our group confirmed the findings of Butler et al. using a 2 s stimulus, suggesting the finding was not an artifact of the 1 s stimulus or of other parameters specific to the prior study.
During the common situation of motion through a static environment, visual and inertial cues have a tight causal relationship that allows sensory integration to occur [1,25–27]. However, not all visual motion corresponds to self-motion; examples include walking through blowing snow or moving with a crowd. Visual motion alone should not be interpreted as self-motion, but there are situations when this occurs, including the false perception of self-motion induced by visual motion (vection) [13,28–32] and visually induced motion sickness [33,34].
When common causation between two sensory modalities is not plausible, they should not be integrated. In the current study, even though the visual stimulus was not plausible as a consequence of the inertial motion, there was still integration, but the influence of the incongruent visual stimulus was diminished to, on average, 64% of the congruent one. There were only two subjects (#2, #20, Fig 4) in whom the normal visual stimulus biased inertial perception and the inverted visual stimulus did not; thus, in most subjects, even the inverted visual stimulus biased inertial perception. Interestingly, subjects #2 and #20, aged 31 and 29 respectively, were the oldest participants in the study. This raises the possibility of considering the impact of age on inertial heading perception in future research.
There was significant variation between subjects. Subject #17 was an outlier in that their bias towards the visual stimulus was much larger than that of any other subject (Fig 4). An argument could be made for excluding this subject as a statistical outlier, but we chose to retain him. The subject was an 18-year-old white male, so he was not an outlier in terms of demographics. The sigma for this subject (Fig 5) was not an outlier. One possible interpretation of this subject's responses is that he did not rigorously follow the instructions and reported the direction of the visual stimulus instead of the inertial stimulus. However, this does not fully explain the findings, as the bias was still less than the offset, and the normal visual stimulus still produced a larger bias than the inverted one.
Previous work in our lab has examined the temporal alignment of visual and inertial stimuli [12] and found that for misalignments of 500 ms or longer, the visual heading direction no longer biases inertial heading perception. One way to think about the current study is that the visual and inertial stimuli were 180° out of phase or, since a 2 s stimulus was used, temporally shifted by 1 s. However, this differs from the type of shifts used previously [12], since here the visual and inertial stimuli began and ended at the same time. The fact that visual-inertial integration still occurred despite a 1 s temporal offset in peak velocity and peak acceleration (Fig 1) implies that factors other than the timing of peak velocity or peak acceleration determine the effect previously described with a timing delay. A future direction could be to independently vary the relative start and end points of these stimuli to gain further insight into what factors are important for this integration to occur.
The current experiments did not include any unisensory conditions (i.e., visual only or inertial only), although such experiments have been previously published by our laboratory [4,20]. We did not want to include a visual-only condition because subjects were specifically asked to identify the inertial heading, and such a condition would likely be confusing. Also, the reliability of the visual stimulus has been shown to be strongly direction dependent, with decreased reliability for more lateral headings [35], which would make a meaningful unisensory visual condition difficult to design and interpret. Although an inertial-only condition could have quantified underlying biases, these were ultimately cancelled by averaging the effects of visual offsets in opposite directions.
It is well known that multisensory integration relies on the relative reliability of the stimuli [1,18,36–39]. The current findings could potentially be explained if the inverted visual stimulus was perceived as less reliable than the normal visual stimulus. Butler et al. found the raised cosine profile was more reliable than the constant velocity profile [11], although both were robustly integrated. In the current study, the difference in the slope (sigma) of the psychometric function fits between the two types of stimuli was not statistically significant (paired t-test, p = 0.4). This suggests that the perceived reliability of the stimuli does not explain the larger bias observed with the normal visual stimulus.
In the current study, there were no attempts to control or measure gaze position, although gaze position was controlled and measured in previous studies from our group [4,20,35]. This was not done here because it is difficult to maintain gaze position without a fixation point, and when a fixation point is present, subjects tend to judge heading by reporting the focus of expansion relative to the fixation point. In the current study, subjects were instructed to report only the direction of the inertial stimulus, which has previously been demonstrated to be independent of eye position [19,35]. Thus, the effect of gaze position in the current paradigm is unknown but likely minimal.
The size of the visual offset in this study did not affect either the size of the bias or the sigma of the psychometric function fit. Multiple offsets were included because one hypothesis was that common causation might be perceived at a larger offset if the visual stimulus was consistent with the acceleration profile, but this was not borne out in the data. The smallest offset tested was 45°, which was large compared with the bias, and this may be why the bias did not depend on the size of the offset. The bias may reach some maximal value, after which further visual offsets have no effect. We have also found that the reliability of the visual stimulus decreases as it becomes more laterally displaced [35], so that larger displacements may have been counteracted by a lower perceived reliability. It is possible, if not likely, that smaller offsets would have produced a smaller bias, but this is outside the scope of the current experiments.
It may be advantageous to integrate visual and inertial signals even when they are not in agreement. Once linear motion reaches a constant velocity, there is no acceleration and, beyond any vibration that may occur, there is visual motion without a corresponding inertial perception. Because this only occurs at a constant velocity, it may explain the earlier finding that a constant velocity visual stimulus has the same influence as a velocity-matched stimulus [11]. The current study demonstrates that visual-inertial integration is negatively impacted once the visual stimulus includes acceleration components that are incongruent with those of the inertial stimulus. This effect was statistically significant (p < 0.0001), supporting the second hypothesis, that the influence of an incongruent visual stimulus is reduced, and ruling out the third, that integration is independent of the visual velocity profile.
Supporting information
S1 File. VisAccel complete data_redacted2.xlsx
https://doi.org/10.1371/journal.pone.0323348.s001
(XLSX)
Acknowledgments
The authors would like to thank Cesar Arduino for providing technical assistance for the experiments.
References
1. Fetsch CR, DeAngelis GC, Angelaki DE. Visual-vestibular cue integration for heading perception: applications of optimal cue integration theory. Eur J Neurosci. 2010;31(10):1721–9. pmid:20584175
2. Fetsch CR, DeAngelis GC, Angelaki DE. Bridging the gap between theories of sensory cue integration and the physiology of multisensory neurons. Nat Rev Neurosci. 2013;14(6):429–42. pmid:23686172
3. Gu Y, Fetsch CR, Adeyemo B, DeAngelis GC, Angelaki DE. Decoding of MSTd population activity accounts for variations in the precision of heading perception. Neuron. 2010;66(4):596–609. pmid:20510863
4. Crane BT. Effect of eye position during human visual-vestibular integration of heading perception. J Neurophysiol. 2017;118(3):1609–21. pmid:28615328
5. Rodriguez R, Crane BT. Common causation and offset effects in human visual-inertial heading direction integration. J Neurophysiol. 2020;123(4):1369–79. pmid:32130052
6. Stein BE, Stanford TR. Multisensory integration: current issues from the perspective of the single neuron. Nat Rev Neurosci. 2008;9(4):255–66. pmid:18354398
7. Ernst MO, Banks MS. Humans integrate visual and haptic information in a statistically optimal fashion. Nature. 2002;415(6870):429–33. pmid:11807554
8. Morein-Zamir S, Soto-Faraco S, Kingstone A. Auditory capture of vision: examining temporal ventriloquism. Brain Res Cogn Brain Res. 2003;17(1):154–63. pmid:12763201
9. Pöppel E, Schill K, von Steinbüchel N. Sensory integration within temporally neutral systems states: a hypothesis. Naturwissenschaften. 1990;77(2):89–91. pmid:2314478
10. Zampini M, Guest S, Shore DI, Spence C. Audio-visual simultaneity judgments. Percept Psychophys. 2005;67(3):531–44. pmid:16119399
11. Butler JS, Campos JL, Bülthoff HH. Optimal visual-vestibular integration under conditions of conflicting intersensory motion profiles. Exp Brain Res. 2015;233(2):587–97. pmid:25361642
12. Rodriguez R, Crane BT. Effect of timing delay between visual and vestibular stimuli on heading perception. J Neurophysiol. 2021;126(1):304–12. pmid:34191637
13. Palmisano S, Allison RS, Pekin F. Accelerating self-motion displays produce more compelling vection in depth. Perception. 2008;37(1):22–33. pmid:18399245
14. Palmisano S, Pinniger GJ, Ash A, Steele JR. Effects of simulated viewpoint jitter on visually induced postural sway. Perception. 2009;38(3):442–53. pmid:19485137
15. Shayman CS, Seo J-H, Oh Y, Lewis RF, Peterka RJ, Hullar TE. Relationship between vestibular sensitivity and multisensory temporal integration. J Neurophysiol. 2018;120(4):1572–7. pmid:30020839
16. de Winkel KN, Weesie J, Werkhoven PJ, Groen EL. Integration of visual and inertial cues in perceived heading of self-motion. J Vis. 2010;10(12):1. pmid:21047733
17. de Winkel KN, Katliar M, Bülthoff HH. Causal inference in multisensory heading estimation. PLoS One. 2017;12(1):e0169676. pmid:28060957
18. Acerbi L, Dokka K, Angelaki DE, Ma WJ. Bayesian comparison of explicit and implicit causal inference strategies in multisensory heading perception. PLoS Comput Biol. 2018;14(7):e1006110. pmid:30052625
19. Cuturi LF, MacNeilage PR. Systematic biases in human heading estimation. PLoS One. 2013;8(2):e56862. pmid:23457631
20. Rodriguez R, Crane BT. Effect of vibration during visual-inertial integration on human heading perception during eccentric gaze. PLoS One. 2018;13(6):e0199097. pmid:29902253
21. Rodriguez R, Crane BT. Effect of range of heading differences on human visual-inertial heading estimation. Exp Brain Res. 2019;237(5):1227–37. pmid:30847539
22. Roditi RE, Crane BT. Suprathreshold asymmetries in human motion perception. Exp Brain Res. 2012;219(3):369–79. pmid:22562587
23. Bermúdez Rey MC, Clark TK, Wang W, Leeder T, Bian Y, Merfeld DM. Vestibular perceptual thresholds increase above the age of 40. Front Neurol. 2016;7:162. pmid:27752252
24. Roditi RE, Crane BT. Directional asymmetries and age effects in human self-motion perception. J Assoc Res Otolaryngol. 2012;13(3):381–401. pmid:22402987
25. MacNeilage PR, Banks MS, Berger DR, Bülthoff HH. A Bayesian model of the disambiguation of gravitoinertial force by visual cues. Exp Brain Res. 2007;179(2):263–90. pmid:17136526
26. Prsa M, Gale S, Blanke O. Self-motion leads to mandatory cue fusion across sensory modalities. J Neurophysiol. 2012;108(8):2282–91. pmid:22832567
27. Frissen I, Campos JL, Souman JL, Ernst MO. Integration of vestibular and proprioceptive signals for spatial updating. Exp Brain Res. 2011;212(2):163–76. pmid:21590262
28. Seno T, Kitaoka A, Palmisano S. Vection induced by illusory motion in a stationary image. Perception. 2013;42(9):1001–5. pmid:24386721
29. Ohmi M, Howard IP. Effect of stationary objects on illusory forward self-motion induced by a looming display. Perception. 1988;17(1):5–11. pmid:3205670
30. de Winkel KN, Kurtz M, Bülthoff HH. Effects of visual stimulus characteristics and individual differences in heading estimation. J Vis. 2018;18(11):9. pmid:30347100
31. Carpenter-Smith TR, Futamura RG, Parker DE. Inertial acceleration as a measure of linear vection: an alternative to magnitude estimation. Percept Psychophys. 1995;57(1):35–42. pmid:7885806
* View Article
* PubMed/NCBI
* Google Scholar
32. 32. Bubka A, Bonato F, Palmisano S. Expanding and contracting optic-flow patterns and vection. Perception. 2008;37(5):704–11. pmid:18605144
* View Article
* PubMed/NCBI
* Google Scholar
33. 33. Hettinger LJ, Berbaum KS, Kennedy RS, Dunlap WP, Nolan MD. Vection and simulator sickness. Mil Psychol. 1990;2(3):171–81. pmid:11537522
* View Article
* PubMed/NCBI
* Google Scholar
34. 34. Webb NA, Griffin MJ. Optokinetic stimuli: motion sickness, visual acuity, and eye movements. Aviat Space Environ Med. 2002;73(4):351–8. pmid:11952055
* View Article
* PubMed/NCBI
* Google Scholar
35. 35. Crane BT. Direction specific biases in human visual and vestibular heading perception. PLoS One. 2012;7(12):e51383. pmid:23236490
* View Article
* PubMed/NCBI
* Google Scholar
36. 36. Butler JS, Smith ST, Campos JL, Bülthoff HH. Bayesian integration of visual and vestibular signals for heading. J Vis. 2010;10(11):23. pmid:20884518
* View Article
* PubMed/NCBI
* Google Scholar
37. 37. Knill DC, Pouget A. The Bayesian brain: the role of uncertainty in neural coding and computation. Trends Neurosci. 2004;27(12):712–9. pmid:15541511
* View Article
* PubMed/NCBI
* Google Scholar
38. 38. Angelaki DE, Gu Y, DeAngelis GC. Multisensory integration: psychophysics, neurophysiology, and computation. Curr Opin Neurobiol. 2009;19(4):452–8. pmid:19616425
* View Article
* PubMed/NCBI
* Google Scholar
39. 39. Fetsch CR, Pouget A, DeAngelis GC, Angelaki DE. Neural correlates of reliability-based cue weighting during multisensory integration. Nat Neurosci. 2011;15(1):146–54. pmid:22101645
* View Article
* PubMed/NCBI
* Google Scholar
Citation: Yakouma MA, Anson E, Crane BT (2025) Effect of inverted visual acceleration profile on vestibular heading perception. PLoS One 20(5): e0323348. https://doi.org/10.1371/journal.pone.0323348
About the Authors:
Miguel A. Yakouma
Roles: Formal analysis, Investigation, Methodology, Writing – review & editing
Affiliations: Department of Biomedical Engineering, University of Rochester, Rochester, New York, United States of America, Department of Otolaryngology, University of Rochester, Rochester, New York, United States of America
Eric Anson
Roles: Conceptualization, Methodology, Writing – review & editing
Affiliations: Department of Otolaryngology, University of Rochester, Rochester, New York, United States of America, Department of Neuroscience, University of Rochester, Rochester, New York, United States of America
Benjamin T. Crane
Roles: Conceptualization, Data curation, Formal analysis, Funding acquisition, Investigation, Methodology, Project administration, Resources, Software, Supervision, Visualization, Writing – original draft, Writing – review & editing
E-mail: [email protected]
Affiliations: Department of Biomedical Engineering, University of Rochester, Rochester, New York, United States of America, Department of Otolaryngology, University of Rochester, Rochester, New York, United States of America, Department of Neuroscience, University of Rochester, Rochester, New York, United States of America
ORCID: https://orcid.org/0000-0002-8327-1257
1. Fetsch CR, Deangelis GC, Angelaki DE. Visual-vestibular cue integration for heading perception: applications of optimal cue integration theory. Eur J Neurosci. 2010;31(10):1721–9. pmid:20584175
2. Fetsch CR, DeAngelis GC, Angelaki DE. Bridging the gap between theories of sensory cue integration and the physiology of multisensory neurons. Nat Rev Neurosci. 2013;14(6):429–42. pmid:23686172
3. Gu Y, Fetsch CR, Adeyemo B, Deangelis GC, Angelaki DE. Decoding of MSTd population activity accounts for variations in the precision of heading perception. Neuron. 2010;66(4):596–609. pmid:20510863
4. Crane BT. Effect of eye position during human visual-vestibular integration of heading perception. J Neurophysiol. 2017;118(3):1609–21. pmid:28615328
5. Rodriguez R, Crane BT. Common causation and offset effects in human visual-inertial heading direction integration. J Neurophysiol. 2020;123(4):1369–79. pmid:32130052
6. Stein BE, Stanford TR. Multisensory integration: current issues from the perspective of the single neuron. Nat Rev Neurosci. 2008;9(4):255–66. pmid:18354398
7. Ernst MO, Banks MS. Humans integrate visual and haptic information in a statistically optimal fashion. Nature. 2002;415(6870):429–33. pmid:11807554
8. Morein-Zamir S, Soto-Faraco S, Kingstone A. Auditory capture of vision: examining temporal ventriloquism. Brain Res Cogn Brain Res. 2003;17(1):154–63. pmid:12763201
9. Pöppel E, Schill K, von Steinbüchel N. Sensory integration within temporally neutral systems states: a hypothesis. Naturwissenschaften. 1990;77(2):89–91. pmid:2314478
10. Zampini M, Guest S, Shore DI, Spence C. Audio-visual simultaneity judgments. Percept Psychophys. 2005;67(3):531–44. pmid:16119399
11. Butler JS, Campos JL, Bülthoff HH. Optimal visual-vestibular integration under conditions of conflicting intersensory motion profiles. Exp Brain Res. 2015;233(2):587–97. pmid:25361642
12. Rodriguez R, Crane BT. Effect of timing delay between visual and vestibular stimuli on heading perception. J Neurophysiol. 2021;126(1):304–12. pmid:34191637
13. Palmisano S, Allison RS, Pekin F. Accelerating self-motion displays produce more compelling vection in depth. Perception. 2008;37(1):22–33. pmid:18399245
14. Palmisano S, Pinniger GJ, Ash A, Steele JR. Effects of simulated viewpoint jitter on visually induced postural sway. Perception. 2009;38(3):442–53. pmid:19485137
15. Shayman CS, Seo J-H, Oh Y, Lewis RF, Peterka RJ, Hullar TE. Relationship between vestibular sensitivity and multisensory temporal integration. J Neurophysiol. 2018;120(4):1572–7. pmid:30020839
16. de Winkel KN, Weesie J, Werkhoven PJ, Groen EL. Integration of visual and inertial cues in perceived heading of self-motion. J Vis. 2010;10(12):1. pmid:21047733
17. de Winkel KN, Katliar M, Bülthoff HH. Causal inference in multisensory heading estimation. PLoS One. 2017;12(1):e0169676. pmid:28060957
18. Acerbi L, Dokka K, Angelaki DE, Ma WJ. Bayesian comparison of explicit and implicit causal inference strategies in multisensory heading perception. PLoS Comput Biol. 2018;14(7):e1006110. pmid:30052625
19. Cuturi LF, MacNeilage PR. Systematic biases in human heading estimation. PLoS One. 2013;8(2):e56862. pmid:23457631
20. Rodriguez R, Crane BT. Effect of vibration during visual-inertial integration on human heading perception during eccentric gaze. PLoS One. 2018;13(6):e0199097. pmid:29902253
21. Rodriguez R, Crane BT. Effect of range of heading differences on human visual-inertial heading estimation. Exp Brain Res. 2019;237(5):1227–37. pmid:30847539
22. Roditi RE, Crane BT. Suprathreshold asymmetries in human motion perception. Exp Brain Res. 2012;219(3):369–79. pmid:22562587
23. Bermúdez Rey MC, Clark TK, Wang W, Leeder T, Bian Y, Merfeld DM. Vestibular perceptual thresholds increase above the age of 40. Front Neurol. 2016;7:162. pmid:27752252
24. Roditi RE, Crane BT. Directional asymmetries and age effects in human self-motion perception. J Assoc Res Otolaryngol. 2012;13(3):381–401. pmid:22402987
25. MacNeilage PR, Banks MS, Berger DR, Bülthoff HH. A Bayesian model of the disambiguation of gravitoinertial force by visual cues. Exp Brain Res. 2007;179(2):263–90. pmid:17136526
26. Prsa M, Gale S, Blanke O. Self-motion leads to mandatory cue fusion across sensory modalities. J Neurophysiol. 2012;108(8):2282–91. pmid:22832567
27. Frissen I, Campos JL, Souman JL, Ernst MO. Integration of vestibular and proprioceptive signals for spatial updating. Exp Brain Res. 2011;212(2):163–76. pmid:21590262
28. Seno T, Kitaoka A, Palmisano S. Vection induced by illusory motion in a stationary image. Perception. 2013;42(9):1001–5. pmid:24386721
29. Ohmi M, Howard IP. Effect of stationary objects on illusory forward self-motion induced by a looming display. Perception. 1988;17(1):5–11. pmid:3205670
30. de Winkel KN, Kurtz M, Bülthoff HH. Effects of visual stimulus characteristics and individual differences in heading estimation. J Vis. 2018;18(11):9. pmid:30347100
31. Carpenter-Smith TR, Futamura RG, Parker DE. Inertial acceleration as a measure of linear vection: an alternative to magnitude estimation. Percept Psychophys. 1995;57(1):35–42. pmid:7885806
32. Bubka A, Bonato F, Palmisano S. Expanding and contracting optic-flow patterns and vection. Perception. 2008;37(5):704–11. pmid:18605144
33. Hettinger LJ, Berbaum KS, Kennedy RS, Dunlap WP, Nolan MD. Vection and simulator sickness. Mil Psychol. 1990;2(3):171–81. pmid:11537522
34. Webb NA, Griffin MJ. Optokinetic stimuli: motion sickness, visual acuity, and eye movements. Aviat Space Environ Med. 2002;73(4):351–8. pmid:11952055
35. Crane BT. Direction specific biases in human visual and vestibular heading perception. PLoS One. 2012;7(12):e51383. pmid:23236490
36. Butler JS, Smith ST, Campos JL, Bülthoff HH. Bayesian integration of visual and vestibular signals for heading. J Vis. 2010;10(11):23. pmid:20884518
37. Knill DC, Pouget A. The Bayesian brain: the role of uncertainty in neural coding and computation. Trends Neurosci. 2004;27(12):712–9. pmid:15541511
38. Angelaki DE, Gu Y, DeAngelis GC. Multisensory integration: psychophysics, neurophysiology, and computation. Curr Opin Neurobiol. 2009;19(4):452–8. pmid:19616425
39. Fetsch CR, Pouget A, DeAngelis GC, Angelaki DE. Neural correlates of reliability-based cue weighting during multisensory integration. Nat Neurosci. 2011;15(1):146–54. pmid:22101645
© 2025 Yakouma et al. This is an open access article distributed under the terms of the Creative Commons Attribution License: http://creativecommons.org/licenses/by/4.0/ (the “License”), which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited. Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License.
Abstract
Visual motion is ambiguous in that it can represent either object motion or self-motion. Visual-vestibular integration is most advantageous during self-motion. The current experiment tests the hypothesis that visual motion must have a motion profile consistent with the inertial motion for integration to occur. To test this, we examined the effect on heading perception when the visual stimulus was consistent with the inertial motion compared with an inverted visual stimulus, which was thus inconsistent with the inertial motion. Twenty healthy human subjects (mean age 20 ± 3 years, 13 female) experienced 2 s of translation, which they reported as left or right. A synchronized 2 s visual heading was offset by 0°, ±45°, ±60°, or ±75°. In randomly interleaved trials, the visual motion was either consistent with the inertial motion or inverted: it started at the peak velocity, decreased to zero mid-stimulus, and then accelerated back to the peak velocity at the end. When the velocity profile of the visual stimulus matched that of the inertial motion, the perceived inertial heading was biased by 10.0 ± 1.8° (mean ± SE) with a 45° visual offset, 8.9 ± 1.7° with a 60° offset, and 9.3 ± 2.5° with a 75° offset. When the visual stimulus was inverted and thus inconsistent with the inertial motion, the respective biases were 6.5 ± 1.5°, 5.6 ± 1.7°, and 5.9 ± 2.0°. The biases with the inverted stimulus were significantly smaller (p < 0.0001), demonstrating that multisensory integration takes the visual motion profile into account rather than simple trajectory endpoints.
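The relationship between the two stimulus conditions can be sketched numerically. The exact velocity profile used in the study is not specified here; a raised-cosine profile over the 2 s stimulus is assumed purely for illustration. The sketch shows the key property implied by the abstract: the consistent and inverted profiles share the same onset, offset, peak velocity, and total displacement (i.e., the same trajectory endpoints), differing only in their velocity/acceleration time course.

```python
import numpy as np

# Illustrative sketch (assumed raised-cosine profile, arbitrary units).
DURATION = 2.0   # stimulus duration (s)
PEAK_VEL = 1.0   # peak velocity

t = np.linspace(0.0, DURATION, 201)

# "Consistent" profile: zero at onset/offset, peak velocity mid-stimulus.
v_consistent = PEAK_VEL * 0.5 * (1 - np.cos(2 * np.pi * t / DURATION))

# "Inverted" profile: peak velocity at onset/offset, zero mid-stimulus.
v_inverted = PEAK_VEL - v_consistent

# Both profiles integrate to the same total displacement, so any
# perceptual difference must reflect the motion profile itself rather
# than the start and end points of the trajectory.
d_consistent = np.trapz(v_consistent, t)
d_inverted = np.trapz(v_inverted, t)
```

Under this assumed profile, both conditions cover identical displacement (here 1.0 unit each), which is why the smaller biases in the inverted condition isolate the contribution of the velocity profile to multisensory integration.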