2009
DOI: 10.1523/jneurosci.5437-08.2009

Natural, Metaphoric, and Linguistic Auditory Direction Signals Have Distinct Influences on Visual Motion Processing

Abstract: To interact with our dynamic environment, the brain merges motion information from auditory and visual senses. However, not only "natural" auditory MOTION, but also "metaphoric" de/ascending PITCH and SPEECH (e.g., "left/right"), influence the visual motion percept. Here, we systematically investigate whether these three classes of direction signals influence visual motion perception through shared or distinct neural mechanisms. In a visual-selective attention paradigm, subjects discriminated the direction of …

Cited by 111 publications (110 citation statements)
References 64 publications
“…Another possibility is that understanding motion language recruits higher-level convergence areas that process visual motion, resulting in a bias to see dot motion in the same direction. Previous work has shown such a congruence effect in that hearing the words "right" and "left" biased participants to see an apparent motion stimulus as moving in the same direction (20). fMRI data revealed that this audiovisual interaction was driven more by activity in the anterior intraparietal sulcus than in hMT+.…”
mentioning
confidence: 78%
“…Neuroimaging studies and transcranial magnetic stimulation (TMS) might help in future research to determine whether all of these correspondences are indeed on a par, neurologically speaking (see Bien et al., 2012; Sadaghiani et al., 2009; Spence & Parise, 2012).…”
Section: Discussion
mentioning
confidence: 99%
“…There is also evidence that statistical crossmodal correspondences can modulate the neural response relatively early during information processing (that is, 220 ms after stimulus onset, see Bien, ten Oever, Goebel, & Sack, 2012; Spence & Parise, 2012; see Spence & Deroy, 2013, for a discussion). Such results should, of course, not be taken to imply that statistical correspondences do not also (or sometimes only) activate other loci later in information processing, as well (Sadaghiani, Maier, & Noppeney, 2009).…”
Section: Underevidenced Behavioral Effects
mentioning
confidence: 97%
“…It is possible that crossmodal correspondences may differ in terms of their neural origin and, accordingly, manifest at different levels of the cognitive system (see also Sadaghiani, Maier, & Noppeney, 2009). Spence (2011) suggested that crossmodal correspondences can be classified into at least three distinct types: structural, which most likely reflect direct correspondences in the neural processing of sensory information; statistical, which reflect crossmodal associations between sensory features or dimensions that exist in nature (e.g., the fact that small objects make higher-pitched sounds) and are, most likely, simply learned; and semantic, which apply when two dimensions overlap in the meaning (or associations) of the stimuli.…”
Section: Discussion
mentioning
confidence: 99%