The study evaluated the degree to which human behaviour during an evacuation scenario is comparable across two environments, one real and one virtual. For this purpose, we created a precise 3D model (digital twin) of a real campus building. In both the experimental group (virtual environment) and the control group (real environment), the participants were all first-time visitors to the building. The groups were instructed to locate a target room in which they would complete a simple task, at which point an evacuation alarm was set off. We evaluated three types of data collected while the participants searched for a way out of the building: gaze hits on various objects (e.g., navigation signs) logged by an eye-tracker, the locomotion and trajectory of the participants, and a combination of these parameters (e.g., points along the path where participants sighted certain objects). The sample comprised 73 participants (35 in the real environment, 38 in the virtual environment). The analysis confirmed that, although certain aspects of human behaviour differed during the simulated evacuation, the locomotion behaviour exhibited by the participants in the two environments was generally comparable. To make these findings easier to apply in practice, we relied on readily accessible hardware and excluded expensive devices such as treadmills.