2010
DOI: 10.1167/10.12.1

Integration of visual and inertial cues in perceived heading of self-motion

Abstract: In the present study, we investigated whether the perception of heading of linear self-motion can be explained by Maximum Likelihood Integration (MLI) of visual and non-visual sensory cues. MLI predicts smaller variance for multisensory judgments compared to unisensory judgments. Nine participants were exposed to visual, inertial, or visual-inertial motion conditions in a moving base simulator, capable of accelerating along a horizontal linear track with variable heading. Visual random-dot motion stimuli were …
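The MLI prediction summarized in the abstract — that the combined visual-inertial estimate has smaller variance than either unisensory estimate — can be illustrated with a short numerical sketch. This is not the study's code or data; the standard deviations below are arbitrary illustrative values.

```python
import math

# Illustrative single-cue heading noise (degrees); not the study's data.
sigma_visual = 4.0    # SD of the visual heading estimate
sigma_inertial = 8.0  # SD of the inertial heading estimate

# MLI weights each cue by its reliability (inverse variance).
w_visual = sigma_inertial**2 / (sigma_visual**2 + sigma_inertial**2)
w_inertial = 1.0 - w_visual

def mli_estimate(heading_visual, heading_inertial):
    """Reliability-weighted combined heading estimate under MLI."""
    return w_visual * heading_visual + w_inertial * heading_inertial

# MLI's key prediction: the combined variance is smaller than
# either unisensory variance.
sigma_combined = math.sqrt(
    (sigma_visual**2 * sigma_inertial**2)
    / (sigma_visual**2 + sigma_inertial**2)
)

print(mli_estimate(10.0, 14.0))  # combined heading, pulled toward the more reliable cue
print(sigma_combined)            # always < min(sigma_visual, sigma_inertial)
```

With these values the visual cue gets weight 0.8, so the combined estimate of 10° (visual) and 14° (inertial) lands at 10.8°, and the predicted combined SD (~3.58°) falls below the better single cue's 4.0° — exactly the variance reduction the study tests for.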

Cited by 65 publications (70 citation statements) · References 40 publications
“…Since at least some studies caution that the way visual cues are presented can disrupt optimal integration (Butler et al. 2011; de Winkel et al. 2010), it was important to utilize a natural visual scene and manipulate thresholds by changing the motion frequency. This contrasts with prior approaches in which one cue is artificially degraded to alter its precision, as in some self-motion perception studies (Butler et al. 2010; Fetsch et al. 2009, 2012; Gu et al. 2008) and other multisensory integration studies (e.g., Ernst and Banks 2002; Landy et al. 1995).…”
Section: Dynamics of Vestibular and Visual Perceptual Thresholds (mentioning)
confidence: 99%
“…Virtual environments can be more complex than natural environments with respect to their scale [4,2], structural complexity, and dimensionality [5]. Furthermore, virtual environments often differ from natural environments in the sensory modalities involved, depending on the use of visual, auditory, and vestibular displays [6,7,8,9,10,11] and interaction methods [8]. Thus, optimal support of human spatial navigation in virtual environments relies on knowing the effects of involving multiple sensory modalities.…”
Section: Introduction (mentioning)
confidence: 99%
“…Psychophysical studies of heading discrimination using two-alternative forced-choice tasks have reported vestibular heading discrimination thresholds in darkness that are as small as a few degrees (Fetsch et al. 2009; Butler et al. 2010, 2015; de Winkel et al. 2010; Drugowitsch et al. 2014). Such threshold values are comparable with (although larger than) those described in visual heading discrimination tasks (Warren and Hannon 1990; Royden et al. 1992; van den Berg and Brenner 1994; Stone and Perrone 1997).…”
Section: Multisensory Cues for Heading Perception (mentioning)
confidence: 82%