It has been proposed that motor imagery contains an element of sensory experience (kinesthetic sensation) that substitutes for the sensory feedback that would normally arise from the overt action. However, no evidence has been provided as to whether kinesthetic sensation is centrally simulated during motor imagery. We psychophysically tested whether motor imagery of palmar flexion or dorsiflexion of the right wrist would influence the sensation of illusory palmar flexion elicited by tendon vibration. We also tested whether motor imagery of wrist movement shares neural substrates with the illusory sensation elicited by peripheral stimulation. Regional cerebral blood flow was measured with H₂¹⁵O and positron emission tomography in 10 right-handed subjects. The tendon of the right wrist extensor was vibrated at 83 Hz, which elicits an illusion of movement ("illusion"), or at 12.5 Hz, which does not ("vibration"). Subjects imagined alternating palmar flexion and dorsiflexion of the wrist at the same speed as the experienced illusory movements ("imagery"). A "rest" condition with eyes closed was also included. We identified activation fields common to the contrasts of imagery versus rest and illusion versus vibration. Motor imagery of palmar flexion psychophysically enhanced the experienced illusory angle of palmar flexion, whereas dorsiflexion imagery reduced it, in the absence of any overt movement. Motor imagery and the illusory sensation commonly activated the contralateral cingulate motor areas, supplementary motor area, dorsal premotor cortex, and ipsilateral cerebellum. We conclude that the kinesthetic sensation associated with imagined movement is internally simulated during motor imagery by recruitment of multiple motor areas.
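The conjunction step described in this abstract (identifying fields active in both the imagery-versus-rest and illusion-versus-vibration contrasts) can be sketched as an intersection of thresholded statistical maps. This is only an illustrative outline: the file names and the z threshold below are assumptions, and the original analysis was carried out on PET data with the authors' own pipeline.

```python
# Minimal sketch of a conjunction analysis: keep voxels that survive
# thresholding in BOTH contrasts (imagery > rest AND illusion > vibration).
# File names and the threshold are hypothetical placeholders.
import nibabel as nib

Z_THRESHOLD = 3.09  # assumed voxel-level threshold (roughly p < .001, one-tailed)

imagery_vs_rest = nib.load("zmap_imagery_vs_rest.nii").get_fdata()
illusion_vs_vibration = nib.load("zmap_illusion_vs_vibration.nii").get_fdata()

# Conjunction: voxels exceeding the threshold in both contrast maps.
common_mask = (imagery_vs_rest > Z_THRESHOLD) & (illusion_vs_vibration > Z_THRESHOLD)
print(f"{common_mask.sum()} voxels are active in both contrasts")
```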
Face perception is critical for social communication. Given its fundamental importance over the course of evolution, innate neural mechanisms may anticipate the computations necessary for representing faces. However, the effect of visual deprivation on the formation of the neural mechanisms that underlie face perception is largely unknown. We previously showed that sighted individuals can haptically recognize basic facial expressions surprisingly well. Moreover, the inferior frontal gyrus (IFG) and posterior superior temporal sulcus (pSTS) in sighted subjects are involved in both haptic and visual recognition of facial expressions. Here, we conducted psychophysical and functional magnetic resonance imaging (fMRI) experiments to determine the nature of the neural representation that subserves the recognition of basic facial expressions in early blind individuals. In a psychophysical experiment, both early blind and sighted subjects haptically identified basic facial expressions at levels well above chance. In the subsequent fMRI experiment, both groups haptically identified facial expressions and shoe types (control); the sighted subjects then completed the same tasks visually. Within the brain regions activated by the visual and haptic identification of facial expressions (relative to that of shoes) in the sighted group, the corresponding haptic identification in the early blind group activated regions in the inferior frontal and middle temporal gyri. These results suggest that the neural system underlying the recognition of basic facial expressions develops supramodally, even in the absence of early visual experience.
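The region-of-interest logic described in this abstract (searching for blind-group haptic activation only within regions defined by the sighted group's face-versus-shoe contrasts) can be sketched as a simple masking operation. The file names and threshold below are assumptions for illustration, not the authors' actual analysis.

```python
# Rough sketch of restricting the blind-group search space to voxels already
# activated in the sighted group's contrasts. File names and threshold are
# hypothetical placeholders.
import numpy as np
import nibabel as nib

Z_THRESHOLD = 3.09  # assumed threshold

sighted_mask = nib.load("zmap_sighted_faces_vs_shoes.nii").get_fdata() > Z_THRESHOLD
blind_haptic = nib.load("zmap_blind_haptic_faces_vs_shoes.nii").get_fdata()

# Consider only voxels inside the sighted-defined mask for the blind group.
blind_within_mask = np.where(sighted_mask, blind_haptic, 0.0)
print(f"{(blind_within_mask > Z_THRESHOLD).sum()} blind-group voxels survive within the sighted mask")
```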
Humans can judge grating orientation by touch. Previous studies indicate that the extrastriate cortex is involved in tactile orientation judgments, suggesting that this area is related to visual imagery. However, it has been unclear which neural mechanisms are crucial for the tactile processing of orientation, because visual imagery is not always required for tactile spatial tasks. We expected such neural mechanisms to involve multisensory areas, because our perception of space is highly integrated across modalities. The current study used functional magnetic resonance imaging during the classification of grating orientation to evaluate the neural substrates responsible for the multisensory spatial processing of orientation. We hypothesized that a region within the intraparietal sulcus (IPS) would be engaged in orientation processing regardless of the sensory modality. Sixteen human subjects classified the orientations of passively touched gratings and performed two control tasks, using both the right and left hands. Tactile orientation classification activated regions around the right postcentral sulcus and IPS, regardless of the hand used, when contrasted with roughness classification of the same stimuli. Right-lateralized activation in these regions was confirmed by evaluating the hemispheric effects of tactile spatial processing with both hands. In contrast, visual orientation classification activated the left middle occipital gyrus when contrasted with color classification of the same stimuli. Furthermore, visual orientation classification activated a part of the right IPS that was also activated by the tactile orientation task. We therefore suggest that a part of the right IPS is engaged in the multisensory spatial processing of grating orientation.
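One common way to summarize the hemispheric effect mentioned in this abstract is a laterality index computed from homologous left- and right-hemisphere activation estimates. The sketch below is purely illustrative: the index formula is a standard convention rather than the paper's reported method, and the numerical values are made up.

```python
# Toy illustration of checking right-lateralization of the orientation > roughness
# contrast for each hand used. Values are hypothetical, not from the study.
def laterality_index(right_activation: float, left_activation: float) -> float:
    """(R - L) / (R + L): positive values indicate right-lateralized activity."""
    return (right_activation - left_activation) / (right_activation + left_activation)

# Hypothetical mean contrast estimates per hemisphere (right IPS, left homologue).
for hand, (right_ips, left_ips) in {"right hand": (1.8, 0.6), "left hand": (1.6, 0.5)}.items():
    print(hand, round(laterality_index(right_ips, left_ips), 2))
```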