Human visual 3D perception is subject to distortions that are influenced by non-visual factors, such as gravitational vestibular signals. Distinct hypotheses about the sensory processing stage at which gravity acts may explain this influence: 1) a direct effect on the visual system, 2) a shaping of the internal representation of space used to interpret sensory signals, or 3) a role in the ability to build multiple, modality-specific, internal depictions of the perceived object. To test these hypotheses, we performed experiments comparing visual versus haptic 3D perception, and the effects of microgravity on these two senses. The results show that visual and haptic perceptual anisotropies reside in body-centered, not gravity-centered, planes, suggesting an egocentric encoding of the information for both sensory modalities. Although coplanar, the perceptual distortions of the two sensory modalities run in opposite directions: depth is visually underestimated, but haptically overestimated, with respect to height and width. Interestingly, microgravity appears to amplify the 'terrestrial' distortions of both senses. Through computational modeling, we show that these findings are parsimoniously predicted only by a gravity-driven facilitation of cross-modal sensory reconstructions, corresponding to Hypothesis 3. This theory explains not only how gravity can shape egocentric perceptions, but also the unexpected opposite effects of gravity on visual and haptic 3D perception. Overall, these results suggest that the brain uses gravity as a stable reference cue to reconstruct concurrent, modality-specific internal representations of 3D objects, even when they are sensed through only one sensory channel.

[Figure: panels A–C; direction labels "Up-Down" and "Longitudinal"]
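To make the logic of Hypothesis 3 concrete, the following is a minimal numerical sketch, not the authors' published model: each modality's percept is taken as a reliability-weighted mix of its own biased depth estimate and a reconstruction built from the other modality, with gravity setting the weight of the cross-modal term. All gains and weights are hypothetical placeholders.

```python
# Minimal sketch (illustrative assumption, not the paper's actual model):
# gravity anchors a cross-modal reconstruction that partially corrects each
# modality's native depth bias; in microgravity that correction weakens.

TRUE_DEPTH = 10.0    # physical depth of the object (arbitrary units)
VISUAL_GAIN = 0.8    # vision underestimates depth (gain < 1), per the abstract
HAPTIC_GAIN = 1.2    # haptics overestimates depth (gain > 1), per the abstract


def perceived_depth(direct_gain: float, other_gain: float, w_cross: float) -> float:
    """Blend the direct estimate with a cross-modal reconstruction.

    w_cross is the weight of the reconstruction derived from the other
    modality; under Hypothesis 3 it is high when gravity provides a stable
    reference and low in microgravity.
    """
    direct = direct_gain * TRUE_DEPTH
    reconstructed = other_gain * TRUE_DEPTH
    return (1.0 - w_cross) * direct + w_cross * reconstructed


# Hypothetical cross-modal weights for the two gravity conditions.
for label, w in [("Earth (1 g)", 0.35), ("Microgravity", 0.05)]:
    vis = perceived_depth(VISUAL_GAIN, HAPTIC_GAIN, w)
    hap = perceived_depth(HAPTIC_GAIN, VISUAL_GAIN, w)
    print(f"{label}: visual depth = {vis:.2f}, haptic depth = {hap:.2f}")
```

With these illustrative numbers, weakening the gravity-anchored cross-modal term amplifies both the visual underestimation and the haptic overestimation while leaving them opposite in sign, reproducing the qualitative pattern described above.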