2013
DOI: 10.1037/a0028416
Auditory, tactile, and audiotactile information processing following visual deprivation.

Abstract: We highlight the results of those studies that have investigated the plastic reorganization processes that occur within the human brain as a consequence of visual deprivation, as well as how these processes give rise to behaviorally observable changes in the perceptual processing of auditory and tactile information. We review the evidence showing that visual deprivation affects the establishment of the spatial coordinate systems involved in the processing of auditory and tactile inputs within the peripersonal …

Cited by 37 publications (34 citation statements)
References 293 publications (584 reference statements)
“…Very little is currently known regarding the effects of partial non-correctable visual loss on auditory distance perception, and perceptual processing in this population remains under-researched relative to that for those with total visual losses (Occelli, Spence, & Zampini, 2013). Kolarik et al. (2013b) found no difference in auditory distance discrimination using level, DRR, or both cues between a partially sighted group and a normally sighted group, whereas enhanced performance was found for those with full visual loss.…”
Section: Concluding Remarks and Suggestions For Further Research
confidence: 99%
“…This demonstrates the operation of interpretative mechanisms in perception (section 1). It is also of great interest that congenitally blind participants were not as prone to the auditory-tactile illusion as sighted participants, suggesting that multisensory integration may differ in the congenitally blind, perhaps because they have better auditory discrimination abilities than sighted people or because vision plays a role in crossmodal calibration (Gori et al., 2010; Occelli, Spence, & Zampini, 2013). The McGurk effect is another example of a multimodal illusion where lip movements and heard speech interact (McGurk & MacDonald, 1976).…”
Section: Key General Considerations For Sensory Substitution
confidence: 99%
“…Due to their lack of vision, congenitally-blind people need to rely more on their remaining senses, such as audition, touch or proprioception. It has been shown that extensive use of remaining sensory modalities can result in superior performance of congenitally-blind people, compared to sighted people, in perceptual, spatial, and attentional tasks [43], [44]. Therefore, congenitally blind participants may have effectively used the available auditory and tactile information provided by the co-actor, i.e., they used an agent-based reference frame, in order to build up a more reliable spatial target representation than they would by using response-based coding alone.…”
Section: Discussion
confidence: 99%
“…There is evidence that congenitally-blind people compensate for the lack of visual input with their remaining senses, to some extent (compensatory hypothesis, [43]). A more extensive use of these senses can result in superior perceptual and spatial skills, as well as more efficient attentional processes [43], [44]. In contrast to the response buttons, a responding co-actor in the social Simon task can provide direct auditory and tactile feedback about his/her location in space, which offers a reliable spatial reference (e.g., by crossing participants' arms over one another) and, thus, may facilitate agent-based coding.…”
Section: Introduction
confidence: 99%