Humans routinely use both of their hands to gather information about the shape and texture of objects. Yet, the mechanisms by which the brain combines haptic information from the two hands to achieve a unified percept are unclear. This study systematically measured the haptic precision of humans exploring a virtual curved object contour with one or both hands to determine whether the brain integrates haptic information across the two hemispheres. Bayesian perception theory predicts that redundant information from both hands should improve haptic estimates. Thus, exploring an object with two hands should yield haptic precision that is superior to unimanual exploration. A bimanual robotic manipulandum passively moved the hands of 20 blindfolded, right-handed adult participants along virtual curved contours. Subjects indicated which of two stimuli of different curvature was more "curved" (forced choice). Contours were explored uni- or bimanually at two orientations (toward or away from the body midline), and the respective psychophysical discrimination thresholds were computed. First, subjects showed a tendency for one hand to be more sensitive than the other, with most subjects exhibiting a left-hand bias. Second, bimanual thresholds were mostly within the range of the corresponding unimanual thresholds and were not predicted by a maximum-likelihood estimation (MLE) model. Third, bimanual curvature perception tended to be biased toward the motorically dominant hand, not toward the haptically more sensitive left hand. Two-handed exploration did not necessarily improve haptic sensitivity. We found no evidence that haptic information from both hands is integrated using an MLE mechanism. Rather, the results are indicative of a process of "sensory selection," in which information from the dominant right hand is used even though the left, nondominant hand may yield more precise haptic estimates.

Keywords: handedness; human; sensorimotor; sensory integration

HUMANS ROUTINELY USE THEIR hands to haptically explore objects in the environment. In many cases, both hands are used to gain information about the properties of the object. We know that haptic sensing requires the integration of spatially disparate sensory signals from cutaneous afferents in the digits with proprioceptive signals of the arm (a process of intersensory integration). However, we know very little about how the brain combines the haptic information from our two hands into a single percept of an object, or about the underlying mechanism by which the nervous system integrates or fuses information from the two haptic systems.

To investigate issues of haptic sensing in a controlled experimental setting, virtual force environments have been used to present haptic stimuli (Chib et al. 2006; Fasse et al. 2000; Henriques and Soechting 2005; Hogan et al. 1990). In these studies, subjects usually grasped the handle or stylus of a robotically controlled manipulandum, which generated appropriate boundary forces resembling the surfaces of virtual objects. The advantage of this technique is that o...
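For concreteness, the MLE account referred to above makes a specific quantitative prediction: if each hand provides an independent, noisy estimate of curvature, an ideal observer weights the two estimates by their inverse variances, so the fused (bimanual) estimate is more precise than either unimanual one. The sketch below illustrates this standard prediction under the common assumption that discrimination thresholds are proportional to the noise SD of each hand's estimate; the threshold values and function names are hypothetical and are not taken from the study.

```python
# Illustrative sketch (not from the paper): the standard maximum-likelihood
# (inverse-variance-weighted) cue-combination prediction for bimanual thresholds,
# assuming thresholds scale with the noise SD of each hand's curvature estimate
# and that the two hands' estimates are statistically independent.

import math

def mle_predicted_threshold(t_left: float, t_right: float) -> float:
    """Predicted bimanual threshold if the two hands' estimates were fused
    by maximum-likelihood (inverse-variance) weighting."""
    return (t_left * t_right) / math.sqrt(t_left**2 + t_right**2)

def mle_weights(t_left: float, t_right: float) -> tuple[float, float]:
    """Relative weight each hand receives under MLE fusion; the more precise
    hand (lower threshold) gets the larger weight."""
    w_left = t_right**2 / (t_left**2 + t_right**2)
    return w_left, 1.0 - w_left

# Hypothetical unimanual thresholds (arbitrary curvature units), chosen only
# to show the qualitative prediction: the fused threshold falls below the
# better (lower) of the two unimanual thresholds.
t_L, t_R = 0.8, 1.2
print(mle_predicted_threshold(t_L, t_R))  # ~0.67 < min(0.8, 1.2)
print(mle_weights(t_L, t_R))              # ~(0.69, 0.31): left hand weighted more
```

Under this model, the predicted bimanual threshold always lies below the better unimanual threshold, which is the benchmark against which the observed bimanual thresholds are compared in this study.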