2014
DOI: 10.1016/j.neuroimage.2014.02.017

Supramodal processing optimizes visual perceptual learning and plasticity

Abstract: Multisensory interactions are ubiquitous in cortex, and it has been suggested that sensory cortices may be supramodal, i.e., capable of functional selectivity irrespective of the sensory modality of inputs (Pascual-Leone and Hamilton, 2001; Renier et al., 2013; Ricciardi and Pietrini, 2011; Voss and Zatorre, 2012). Here, we asked whether learning to discriminate visual coherence could benefit from supramodal processing. To this end, three groups of participants were briefly trained to discriminate which of a red …



Cited by 35 publications (92 citation statements)
References 110 publications (118 reference statements)
“…For example, sensory-guided plasticity using auditory or vibrotactile perception has been suggested as a possible approach to enhancing perceptual learning with a visual prosthesis (Merabet et al, 2005; Proulx et al, 2014). This suggestion is consistent with findings from psychophysical training experiments that show better visual perception after training with concordant acoustic patterns (Shams et al, 2011; van Wassenhove, 2013; Zilber et al, 2014). Such results encourage the view that multisensory stimuli are consistently useful in promoting unisensory perceptual learning.…”
Section: Introduction (supporting)
confidence: 90%
“…The literature reports examples of auditory stimuli promoting visual perceptual learning with non-speech stimuli (Shams and Seitz, 2008; Shams et al, 2011; Zilber et al, 2014). But our recent study of prelingually deaf adults with late-acquired cochlear implants showed that visual speech impeded auditory perceptual learning (Bernstein et al, 2014), while the same AV training did not impede and even promoted to some extent the auditory perceptual learning of adults with normal hearing.…”
Section: Experiment 2: Lipreading Training With VA vs. VO Stimuli (mentioning)
confidence: 99%
“…However, we also outlined that these findings may result from attention effects, as originally observed in macaques [80]. To further investigate such issues and disentangle attention from operative effects in the recourse to high frequencies, future work will be devoted to the analysis of another existing MEG dataset [84], for which complementary eye-tracker recordings will make it possible to probe attention through measurements of ocular saccades in conjunction with behavioral performance.…”
Section: Discussion (mentioning)
confidence: 99%
“…The time window from 100 to 600 ms was used (126 time points). The ROIs were delineated on each participant (for more detail, see [13]) for both the right and left hemispheres (Fig. 3), except the frontal-pole region, which is a label from the FreeSurfer parcellation [14]. The dimensions of the data became: n = 196 at most, and p = 126 × 9000 ∼ 10^6 at most, depending on the size of the label.…”
Section: B. Source Space (mentioning)
confidence: 99%
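The dimensions quoted above (n = 196 trials, p = 126 time points × up to 9000 sources) can be made concrete with a minimal sketch. This is illustrative only: the variable names and the flattening scheme are assumptions, not the cited authors' actual pipeline, and the source count is taken at its quoted maximum.

```python
import numpy as np

# Illustrative numbers from the quoted statement (assumptions, not
# the authors' exact pipeline):
n_trials = 196     # at most
n_times = 126      # 100-600 ms window, 126 time points
n_sources = 9000   # at most, depends on the ROI label size

# Flattening each trial's ROI source time courses into one feature
# vector yields p = n_times * n_sources features per trial.
p = n_times * n_sources
X = np.zeros((n_trials, p))

print(X.shape)                 # (196, 1134000)
print(f"p is on the order of {p:.0e}")
```

This makes explicit why the quoted p ∼ 10^6 vastly exceeds n, the classic high-dimensional, low-sample-size regime that motivates the regularized decoding discussed in these statements.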
“…Similarly, mSTS showed better discriminative power below the perceptual threshold. The regions of interest defined on the basis of prior analysis [13] show clear specificity for decoding. By contrast, a control label (here, the frontal pole; black curve) barely reflects the perceptual threshold.…”
Section: B. Source Space (mentioning)
confidence: 99%