Current interpretations of extinction suggest that the disorder is due to an unbalanced competition between ipsilesional and contralesional representations of space. The question addressed in this study is whether the competition between left and right representations of space in one sensory modality (i.e., touch) can be reduced or exacerbated by the activation of an intact spatial representation in a different modality that is functionally linked to the damaged representation (i.e., vision). This hypothesis was tested in 10 right-hemisphere-lesioned patients who suffered from reliable tactile extinction. We found that a visual stimulus presented near the patient's ipsilesional hand (i.e., visual peripersonal space) inhibited the processing of a tactile stimulus delivered on the contralesional hand (cross-modal visuotactile extinction) to the same extent as did an ipsilesional tactile stimulation (unimodal tactile extinction). It was also found that a visual stimulus presented near the contralesional hand improved the detection of a tactile stimulus applied to the same hand. In striking contrast, much weaker modulatory effects of vision on touch perception were observed when a visual stimulus was presented far from the space immediately around the patient's hand (i.e., extrapersonal space). This study clearly demonstrates the existence of a visual peripersonal space centered on the hand in humans and its modulatory effects on tactile perception. These findings are explained by referring to the activity of bimodal neurons in the premotor and parietal cortex of the macaque, which have tactile receptive fields on the hand and corresponding visual receptive fields in the space immediately adjacent to the tactile fields.
Previous findings have demonstrated the existence of a visual peripersonal space centered on the hand in humans and its modulatory effects on tactile perception. A strong modulatory effect of vision on touch perception was found when a visual stimulus was presented near the hand. In contrast, when the visual stimulus was presented far from the hand, only a weak modulatory effect was found. The aim of the present study was to verify whether such cross-modal links between touch and vision in the peripersonal space centered on the hand are mediated by proprioceptive signals specifying the current hand position or whether they directly reflect an interaction between the two sensory modalities, i.e., vision and touch. To this aim, cross-modal effects were studied in two experiments: one in which patients could see their hands and one in which vision of their hands was prevented. The results showed strong modulatory effects of vision on touch perception when the visual stimulus was presented near the seen hand and only mild effects when vision of the hand was prevented. These findings are explained by referring to the activity of bimodal neurons in the premotor and parietal cortex of the macaque, which have tactile receptive fields on the hand and corresponding visual receptive fields in the space immediately adjacent to the tactile fields. One important feature of these bimodal neurons is that their responsiveness to visual stimuli delivered near the body part is reduced or even extinguished when the view of that body part is prevented. This implies that, at least for the hand, seeing the hand is crucial for determining the spatial mapping between vision and touch that takes place in peripersonal space. In contrast, the proprioceptive signals specifying the current hand position in space do not seem to be relevant in determining the cross-modal interaction between vision and touch.
A convergent series of studies in monkeys and man suggests that the computation of visual space is performed in several brain regions for different behavioural purposes. Among these multiple spatial areas, the ventral intraparietal cortex, the putamen and the ventral aspect of the premotor cortex (area 6) contain a system for representing visual space near the face (peripersonal space). In these cerebral areas some neurons are bimodal: they have tactile receptive fields on the face, and they can also be driven by visual stimuli located near the tactile field. The spatial correspondence between the visual and tactile receptive fields provides a map of near visual space coded in body-part-centred co-ordinates. In the present study we demonstrate for the first time the existence of a visual peripersonal space centred on the face in humans. In patients with right hemispheric lesions, visual stimuli delivered in the space near the ipsilesional side of the face extinguished tactile stimuli on the contralesional side (cross-modal visuotactile extinction) to the same extent as did an ipsilesional tactile stimulation (unimodal tactile extinction). Furthermore, a visual stimulus presented in the proximity of the contralesional side of the face improved the detection of a left tactile stimulus: i.e. under bilateral tactile presentation, patients were more accurate in reporting the presence of a left tactile stimulus when a simultaneous visual stimulus was presented near the left side of the face. However, when visual stimuli were delivered far from the face, visuotactile extinction and visuotactile facilitation effects were dramatically reduced. These findings are consistent with the hypothesis of a representation of visual peripersonal space coded in body-part-centred co-ordinates, and they provide a striking demonstration of the modularity of human visual space.
The aim of the present study was to assess the relationship between overt and covert orienting of attention in visual neglect patients with parietal and fronto-parietal lesions. Two stimuli were presented at eccentricities of 8° or 20° in the left (LVF) or right (RVF) visual field, and the patient was required to maintain fixation on a central mark and to respond only manually to the appearance of the stimulus. Neglect patients with fronto-parietal lesions showed a lack of oculomotor control and the presence of leftward eye movements without corresponding attentional shifts. Neglect patients with parietal lesions did not show this phenomenon. They rarely responded ocularly and manually to LVF stimuli, whereas they were unable to inhibit an automatic ocular orienting reaction towards RVF stimuli. When an RVF stimulus triggered both ocular and attentional shifts, the pattern of responses revealed a retinal eccentricity effect: patients were more accurate in responding to stimuli located at 8° than at 20°. In contrast, when an RVF stimulus triggered only attentional shifts, the results showed the attentional gradient effect (Làdavas, 1990): patients were more accurate in responding to stimuli located at 20° than at 8°. Therefore, the results of the present study seem to suggest a functional dissociation of the mechanisms subserving attentional and gaze orienting and a differential role played by the frontal and parietal lobes in overt visual orienting.