2013
DOI: 10.1111/ejn.12176

The influence of static eye and head position on the ventriloquist effect

Abstract: Orienting responses to audiovisual events have shorter reaction times and better accuracy and precision when images and sounds in the environment are aligned in space and time. How the brain constructs an integrated audiovisual percept is a computational puzzle, because the auditory and visual senses are represented in different reference frames: the retina encodes visual locations with respect to the eyes, whereas sound-localisation cues are referenced to the head. In the well-known ventriloquist effect, t…

Cited by 13 publications (15 citation statements)
References 108 publications
“…First, in line with several laboratory studies of multisensory integration using simple sensory stimuli (e.g. white noise bursts and LED flashes) [16-23,25,26], a lower auditory SNR typically induced stronger multisensory enhancement. However, here we report that for the lowest SNRs (-21 dB) the enhancement saturated, or even slightly dropped (Fig.…”
Section: Inverse Effectiveness (supporting)
confidence: 59%
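For readers unfamiliar with how such enhancement is quantified, the sketch below applies one commonly used multisensory enhancement index, (AV - max(A, V)) / max(A, V), to made-up accuracy values across auditory SNRs; the numbers, variable names, and resulting pattern are illustrative assumptions, not data from the cited study.

# Illustrative sketch only (values are made up, not data from the cited study):
# one common definition of multisensory enhancement compares audiovisual (AV)
# performance with the best unisensory performance at each auditory SNR.
import numpy as np

snr_db = np.array([-21, -15, -9, -3, 3])            # hypothetical auditory SNRs (dB)
acc_a  = np.array([0.15, 0.30, 0.55, 0.75, 0.90])   # hypothetical auditory-only accuracy
acc_v  = np.full(5, 0.40)                           # hypothetical visual-only accuracy
acc_av = np.array([0.48, 0.62, 0.78, 0.88, 0.95])   # hypothetical audiovisual accuracy

best_unisensory = np.maximum(acc_a, acc_v)
enhancement = (acc_av - best_unisensory) / best_unisensory   # proportional gain

for snr, gain in zip(snr_db.tolist(), enhancement.tolist()):
    print(f"SNR {snr:+4d} dB: enhancement = {100 * gain:5.1f}%")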
“…speech-reading/lipreading) has a positive impact on speech perception, and audiovisual speech recognition in acoustic noise is substantially better than for auditory speech alone [4-15]. Audiovisual integration, in general, has been the topic of a variety of behavioral and electrophysiological studies, involving rapid eye-orienting to simple peripheral stimuli [16,17], spatial and temporal discrimination of audiovisual objects [18-20], and the integrative responses of single neurons in cats and monkeys [21-23]. Three main principles have been shown to govern the mechanisms of multisensory integration: i. spatial alignment of the different sources, ii.…”
Section: Introduction (mentioning)
confidence: 99%
“…Subjects had to avoid any head movements. Eye movements were not controlled for, since they have been shown to have no effect on the ventriloquism effect (Bertelson et al. 2000; van Barneveld and van Wanrooij 2013). The time interval between the fixation light and stimulus onset varied randomly between 0.5 and 1 s.…”
Section: Methods (mentioning)
confidence: 99%
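As a minimal illustration of the quoted trial timing (fixation light, then a uniformly random 0.5-1 s delay before stimulus onset), the following Python sketch draws such intervals; the function name and everything beyond the 0.5-1 s range are assumptions for illustration, not details from the cited studies.

# Minimal sketch of the described trial timing: after the fixation light, the
# stimulus onset follows a uniformly random 0.5-1.0 s interval.
import random

def draw_foreperiod(low_s: float = 0.5, high_s: float = 1.0) -> float:
    """Return a random fixation-to-stimulus interval in seconds."""
    return random.uniform(low_s, high_s)

print([round(draw_foreperiod(), 3) for _ in range(5)])   # e.g. [0.731, 0.912, ...]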
“…Head-movement signals were analyzed in MatLab (version 2014b) using custom-written software (Van Barneveld and Van Wanrooij, 2013). For each stimulus and participant, we determined the best-fit linear regression line between stimulus and response location, respectively α_T and α_R:…”
Section: Localization (mentioning)
confidence: 99%
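The quoted analysis fits a line relating response location (α_R) to stimulus location (α_T), whose slope is usually read as response gain and whose intercept is read as bias. A minimal Python sketch of such a fit, with made-up azimuth values and illustrative variable names (the original analysis used custom MATLAB code), could look like this:

# Minimal sketch of a best-fit linear regression of response location on
# stimulus location: alpha_R ~ bias + gain * alpha_T. Azimuth values are
# made up for illustration.
import numpy as np

alpha_T = np.array([-60, -40, -20, 0, 20, 40, 60], dtype=float)  # stimulus azimuth (deg)
alpha_R = np.array([-52, -35, -18, 1, 17, 36, 55], dtype=float)  # response azimuth (deg)

gain, bias = np.polyfit(alpha_T, alpha_R, deg=1)   # least-squares fit: slope, intercept
residuals = alpha_R - (bias + gain * alpha_T)
r_squared = 1 - np.sum(residuals**2) / np.sum((alpha_R - alpha_R.mean())**2)

print(f"gain = {gain:.2f}, bias = {bias:.1f} deg, R^2 = {r_squared:.3f}")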