Although motor actions can profoundly affect the perceptual interpretation of sensory inputs, it is not known whether the combination of sensory and movement signals occurs only for sensory surfaces undergoing movement or whether it is a more general phenomenon. In the haptic modality, the independent movement of multiple sensory surfaces poses a challenge to the nervous system when combining the tactile and kinesthetic signals into a coherent percept. When exploring a stationary object, the tactile and kinesthetic signals come from the same hand. Here we probe the internal structure of haptic combination by directing the two signal streams to separate hands: one hand moves but receives no tactile stimulation, while the other hand feels the consequences of the first hand's movement but remains still. We find that both discrete and continuous tactile and kinesthetic signals are combined as if they came from the same hand. This combination proceeds by direct coupling or transfer of the kinesthetic signal from the moving to the feeling hand, rather than assuming the displacement of a mediating object. The combination of signals is due to perception rather than inference, because a small temporal offset between the signals significantly degrades performance. These results suggest that the brain simplifies the complex coordinate transformation task of remapping sensory inputs to take into account the movements of multiple body parts in haptic perception, and they show that the effects of action are not limited to moving sensors.

touch | haptics | perception | sensorimotor integration | kinesthesis

Motor and kinesthetic signals arising from the movement of the eyes in the head, and translation of the eyes and ears in space due to head and body movements, have been shown to play an important role in visual (1-3) and auditory (4-6) perception.
However, because these modalities have few sensory surfaces, and because the movement of those surfaces is rigidly constrained, the number of kinesthetic degrees of freedom is limited. In active touch, or the haptic modality (7-13), the large number of sensory surfaces and the nearly unlimited ways these surfaces can move raise the question of how movement can be represented and associated with the cutaneous or tactile signals. To study how tactile and kinesthetic cues are combined to haptically perceive object shape and size, we created a novel haptic stimulus in which these cues were completely dissociated. This stimulus consisted of simulated triangles felt through a narrow slit, as in anorthoscopic perception in vision (14-16) or haptics (17), and as illustrated in Fig.