Many studies have investigated the contributions of vision, touch, and proprioception to body ownership, i.e., the multisensory perception of limbs and body parts as our own. However, the computational processes and principles that determine subjectively experienced body ownership remain unclear. To address this issue, we developed a detection-like psychophysics task based on the classic rubber hand illusion paradigm, in which participants reported whether the rubber hand felt like their own (the illusion) or not. We manipulated the asynchrony of visual and tactile stimuli delivered to the rubber hand and the hidden real hand under different levels of visual noise. We found that (1) the probability that the rubber hand illusion emerged increased with visual noise and was well predicted by a causal inference model in which the observer computes the probability that the visual and tactile signals come from a common source; (2) the causal inference model outperformed a non-Bayesian model in which the observer does not take sensory uncertainty into account; (3) by comparing body ownership and visuotactile synchrony detection, we found that the prior probability of inferring a common cause for the two types of multisensory percept was correlated but greater for ownership, which suggests that individual differences in the rubber hand illusion can be explained at the computational level as differences in how priors are used in the multisensory integration process. These results imply that the same statistical principles determine the perception of the bodily self and the external world.
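To make the causal inference account concrete, the sketch below illustrates how such an observer could compute the posterior probability of a common cause from a noisy internal measurement of visuotactile asynchrony. It is a minimal illustration under assumed simplifications, not the fitted model from the study: the parameter names (sigma, p_same, async_range), the Gaussian measurement noise, and the flat likelihood under separate causes are assumptions introduced here for clarity.

```python
import numpy as np

def p_common(x_asynchrony, sigma, p_same, async_range=1.0):
    """Posterior probability that vision and touch share a common cause,
    given a noisy internal measurement of visuotactile asynchrony.

    Illustrative assumptions (not the authors' exact parameterization):
    - under a common cause, the true asynchrony is 0, so the measurement
      x_asynchrony is Gaussian around 0 with s.d. sigma (sigma grows with
      visual noise);
    - under separate causes, the asynchrony is uniform over +/- async_range,
      so the measurement likelihood is approximately flat;
    - p_same is the observer's prior probability of a common cause.
    """
    like_common = np.exp(-0.5 * (x_asynchrony / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
    like_separate = 1.0 / (2.0 * async_range)
    return (like_common * p_same) / (like_common * p_same + like_separate * (1.0 - p_same))

# The illusion ("the rubber hand feels like mine") is reported when the
# posterior favors a common cause; larger sigma (more visual noise) makes
# the same asynchrony easier to attribute to a common source.
for sigma in (0.1, 0.3):
    print(sigma, p_common(x_asynchrony=0.3, sigma=sigma, p_same=0.7) > 0.5)
```

In this toy setup, a 0.3 s asynchrony is rejected as a common cause under low noise but accepted under high noise, which mirrors finding (1): the illusion becomes more likely as visual noise increases. Finding (3) would correspond to fitting p_same separately for the ownership and synchrony judgments.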