The hand explores the environment to obtain tactile information that can be fruitfully integrated with other modalities, such as vision, audition, and movement. In theory, somatosensory signals gathered by the hand are accurately mapped into the world-centered (allocentric) reference frame, such that multimodal signals, whether visual-tactile or motor-tactile, are perfectly aligned. However, an accumulating body of evidence indicates that the perceived tactile orientation or direction is inaccurate, yielding a surprisingly large perceptual bias. To investigate this bias, the present study presented tactile motion stimuli to healthy adult participants in a variety of finger and head postures and asked them to report the perceived direction of motion on a video screen placed in the frontoparallel plane in front of the eyes. The results showed that the perceptual bias could be divided into systematic and nonsystematic components. Systematic bias, defined as the mean difference between the perceived and veridical directions, correlated linearly with the relative posture between the finger and the head. By contrast, nonsystematic bias, defined as the residual variation in bias across stimulus directions, was highly individualized and phase-locked to the stimulus orientation presented on the skin. Overall, the present findings on systematic bias indicate that the transformation bias among reference frames is dominated by the finger-to-head posture. Moreover, the highly individualized nature of the nonsystematic bias reflects how information is obtained by orientation-selective units in the primary somatosensory cortex (S1).

A hallmark of hand function is to manipulate objects and acquire tactile information, a process known as haptics. Furthermore, human manipulation of objects in a series of haptic processes requires an integrated neural representation of the body (body schema) and of the space around the body [1-5]. When touching an object with the hands, we perceive tactile motion [6], which is crucial for determining the direction [7-11] and speed [8,12,13] of the object, as well as for planning subsequent movements [14]. While the tactile motion perceived by the skin of the hands is encoded in the somatotopic (skin-centered) reference frame [15,16], the physical movement of the object is actually represented in the allocentric (external world-centered) reference frame [17,18]. Misalignment between these reference frames frequently causes a bias in perceiving tactile motion [11,19]. Tactile remapping is therefore the outcome of multimodal integration and has been shown to be affected by finger postures [20], body postures [21], and transformations between eye- and body-centered reference frames [22,23], indicating a large-scale transformation and integration involving multiple reference frames [24-27]. For example, perceived visual [28] and tactile [29] motion directions can be affected by hand and arm postures [30,31]. Studies on temporal integration suggested that multimodal information was processed by a recurrent scheme a...
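To make the bias decomposition described in the abstract concrete, the following is a minimal formalization sketch; it is not given as equations in the original text, and the parameters $k$, $\Delta$, $A$, and $\theta_0$ are illustrative assumptions. Let $\theta$ denote the veridical direction of tactile motion on the skin and $\varphi(\theta)$ the direction reported by the participant, over $N$ tested directions:

\[
b(\theta) \;=\; \varphi(\theta) - \theta,
\qquad
B \;=\; \frac{1}{N}\sum_{i=1}^{N} b(\theta_i),
\qquad
\tilde{b}(\theta) \;=\; b(\theta) - B .
\]

Under this notation, the reported findings correspond to the systematic component $B$ scaling approximately linearly with the relative finger-to-head posture angle $\Delta$, i.e. $B \approx k\,\Delta$ for some fitted slope $k$, and to the nonsystematic component $\tilde{b}(\theta)$ being phase-locked to stimulus orientation. Since orientation (unlike direction) repeats with period $\pi$, one hedged way to express such phase-locking is $\tilde{b}(\theta) \approx A \sin\!\bigl(2(\theta - \theta_0)\bigr)$, with amplitude $A$ and phase $\theta_0$ varying across individuals.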