The hand explores the environment to obtain tactile information that can be fruitfully integrated with other functions, such as vision, audition, and movement. In theory, somatosensory signals gathered by the hand are accurately mapped into the world-centered (allocentric) reference frame so that multimodal signals, whether visual-tactile or motor-tactile, are perfectly aligned. However, accumulating evidence indicates that perceived tactile orientation or direction is inaccurate, yielding a surprisingly large perceptual bias. To investigate this bias, the present study presented tactile motion stimuli to healthy adult participants in a variety of finger and head postures and asked them to report the perceived direction of motion on a video screen placed in the frontoparallel plane in front of the eyes. The results showed that the perceptual bias could be divided into systematic and nonsystematic components. Systematic bias, defined as the mean difference between the perceived and veridical directions, correlated linearly with the relative posture between the finger and the head. By contrast, nonsystematic bias, defined as the residual variation in bias across stimulus directions, was highly individualized and phase-locked to the stimulus orientation presented on the skin. Overall, the findings on systematic bias indicate that the transformation bias among reference frames is dominated by the finger-to-head posture, while the highly individualized nature of nonsystematic bias reflects how information is obtained by orientation-selective units in the S1 cortex.

A hallmark of hand function is to manipulate objects and acquire tactile information, a process denoted as haptics. Manipulating objects through a series of haptic processes requires an integrated neural representation of the body (body schema) and of the space around the body [1-5]. When touching an object with the hands, we perceive tactile motion [6], which is crucial for determining the direction [7-11] and speed [8,12,13] of the object, as well as for planning subsequent movements [14]. While tactile motion sensed by the skin of the hand is encoded in the somatotopic (skin-centered) reference frame [15,16], the physical movement of the object is represented in the allocentric (external, world-centered) reference frame [17,18]. Misalignment between these reference frames frequently biases the perception of tactile motion [11,19]. Tactile remapping is therefore the outcome of multimodal integration, which has been shown to be affected by finger posture [20], body posture [21], and the transformation between eye- and body-centered reference frames [22,23], indicating a large-scale transformation and integration involving multiple reference frames [24-27]. For example, perceived visual [28] and tactile [29] motion directions can be affected by hand and arm postures [30,31]. Studies on temporal integration have suggested that multimodal information is processed by a recurrent scheme a...
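To make the two-component bias concrete, the following Python sketch models a reported direction as the veridical direction plus a systematic term that is linear in the finger-to-head posture angle and a nonsystematic term phase-locked to stimulus orientation (with the 180-degree periodicity an orientation-locked effect implies). The functional form, parameter names, and values are illustrative assumptions for exposition, not the fitted model from the study.

```python
import numpy as np

def perceived_direction(stim_dir_deg, finger_head_angle_deg,
                        k=0.5, amp=5.0, phase_deg=0.0):
    """Toy model of the reported tactile motion direction (degrees).

    systematic bias:    linear in the finger-to-head posture angle
    nonsystematic bias: sinusoidal, phase-locked to stimulus
                        orientation (180-degree periodicity)
    All parameter values are illustrative assumptions.
    """
    systematic = k * finger_head_angle_deg
    nonsystematic = amp * np.sin(np.deg2rad(2.0 * (stim_dir_deg - phase_deg)))
    return (stim_dir_deg + systematic + nonsystematic) % 360.0

# Example: a 90-degree stroke with the finger rotated 30 degrees
# relative to the head yields a report biased away from 90 degrees.
print(perceived_direction(90.0, 30.0))
```

Under this toy model, averaging reports over all stimulus directions cancels the sinusoidal term and recovers the systematic component, mirroring how the mean difference between perceived and veridical directions isolates the posture-dependent bias.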
Information obtained from multiple sensory modalities, such as vision and touch, is integrated to yield a holistic percept. Because haptic exploration usually involves cross-modal sensory experience, it is necessary to develop an apparatus that can characterize how a biological system integrates visual-tactile information, as well as how a robotic device infers object properties from both vision and touch. In the present study, we developed a novel visual-tactile cross-modal integration stimulator consisting of an LED panel for presenting visual stimuli and a tactile stimulator with three degrees of freedom that can deliver tactile motion stimuli with arbitrary direction, speed, and indentation depth into the skin. The apparatus can present cross-modal stimuli in which the spatial locations of the visual and tactile stimulation are perfectly aligned. We presented visual-tactile stimuli in which the visual and tactile motion directions were either congruent or incongruent, and human observers reported the perceived direction of visual motion. The results showed that the perceived direction of visual motion can be biased by the direction of tactile motion when the visual signal is weakened. They also showed that visual-tactile motion integration follows the rule of temporal congruency of multimodal inputs, a fundamental property of cross-modal integration.
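As an illustration of the congruent/incongruent design described above, here is a minimal Python sketch that builds a shuffled trial list crossing motion direction, visual-tactile onset asynchrony, and congruency. The specific directions, SOA values, and repetition counts are hypothetical placeholders, not the actual experimental parameters.

```python
import random
from dataclasses import dataclass

@dataclass
class Trial:
    visual_dir_deg: int   # direction of visual motion on the LED panel
    tactile_dir_deg: int  # direction of the 3-DoF tactile stroke
    soa_ms: int           # visual-tactile stimulus onset asynchrony
    congruent: bool

def make_trials(n_per_cell=10, directions=(0, 90, 180, 270),
                soas_ms=(0, 100, 300), seed=0):
    """Build a shuffled trial list crossing direction, SOA, and
    congruency; incongruent trials use the opposite tactile direction.
    All condition values here are hypothetical placeholders."""
    trials = []
    for d in directions:
        for soa in soas_ms:
            for congruent in (True, False):
                for _ in range(n_per_cell):
                    t_dir = d if congruent else (d + 180) % 360
                    trials.append(Trial(d, t_dir, soa, congruent))
    random.Random(seed).shuffle(trials)
    return trials

trials = make_trials()
print(len(trials), trials[0])  # 240 trials; first trial after shuffling
```

Sweeping the SOA values in such a design is one way to probe the temporal-congruency rule: tactile capture of the visual direction should be strongest when the two onsets coincide.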