Immersive virtual reality (VR) technology has become a popular tool for fundamental and applied spatial cognition research. One challenge researchers face is emulating walking through a large-scale virtual space while the user remains in a small physical space. To address this, a variety of VR movement interfaces have been proposed, from traditional joysticks to teleportation and omnidirectional treadmills. These movement methods tap into different mental processes of spatial learning during navigation, but their impacts on distance perception remain unclear. In this paper, we investigated the roles of visual display, proprioception, and optic flow in distance perception in a large-scale building by comparing four different movement methods. Eighty participants either walked in a real building or moved through its virtual replica using one of three movement methods: VR-treadmill, VR-touchpad, or VR-teleportation. Results revealed that, first, visual display played a major role in both perceived and traversed distance estimates but did not affect environmental distance estimates. Second, proprioception and optic flow did not affect the overall accuracy of distance perception, but having only intermittent optic flow (in the VR-teleportation condition) impaired the precision of traversed distance estimates. In conclusion, movement method plays a significant role in distance perception but does not affect the configurational knowledge acquired in a large-scale real or virtual building, and the VR-touchpad movement method provides an effective interface for navigation in VR.