The ExtendedHand interface projects a computer graphics (CG) hand, synchronized with the user's physical hand movements, onto the real environment, visually extending the user's reach. This paper focuses on enhancing the user's tactile perception of an object through cross-modal phenomena by providing a sound texture (auditory information matching the object) when the CG hand touches it. ExtendedHand lets users touch objects beyond their physical reach, an experience their physical body cannot provide. In such situations, it is unclear whether sound pressure should be attenuated with distance according to physical laws. Additionally, we know from everyday experience that the speed at which a hand traces an object changes the resulting sound. In ExtendedHand, the user's physical hand movement is amplified when mapped to the CG hand, so the physical hand's speed does not match the CG hand's speed. This raises the question of whether sound texture feedback should align with the visual information of the CG hand or with the proprioceptive information of the physical hand. We conducted two user studies to explore appropriate sound texture feedback for the projected CG hand. The results indicate that when the CG hand touches objects at various distances, the sound pressure should follow the same distance-based attenuation observed in physical phenomena. Furthermore, the results suggest that even when the CG hand traces an object swiftly, users perceive sounds corresponding to a slower tracing speed as more suitable.

INDEX TERMS Augmented reality, body augmentation, sound texture feedback, tactile sensation.
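As a point of reference for the distance-based attenuation mentioned above, the inverse-distance law for sound pressure from a point source can be sketched as follows. This is a minimal illustration of the underlying physics, not the paper's implementation; the function names and reference-distance convention are assumptions.

```python
import math

def attenuated_pressure(p_ref: float, r_ref: float, r: float) -> float:
    # Inverse-distance law for a point source: p(r) = p_ref * (r_ref / r).
    # Doubling the distance halves the pressure amplitude.
    return p_ref * (r_ref / r)

def level_drop_db(r_ref: float, r: float) -> float:
    # Corresponding sound pressure level change in dB relative to r_ref:
    # about -6 dB per doubling of distance.
    return -20.0 * math.log10(r / r_ref)
```

For example, moving from a 1 m reference distance to 2 m halves the pressure amplitude and lowers the level by roughly 6 dB.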