In this report, we present the project URWalking, conducted at the University of Regensburg, and describe its major outcomes. Firstly, we developed an indoor navigation system for pedestrians, available as a web application and as an Android app, with position tracking of users in indoor and outdoor environments. Our implementation demonstrates that a variant of the $$A^*$$
-algorithm by Ullmann (Datengetriebene Optimierung präferenzadaptiver Fußwegrouten durch Gebäudekomplexe, https://epub.uni-regensburg.de/43697/, 2020) can handle the routing problem in large, multi-level indoor environments efficiently. Secondly, the apps have been used in several studies to deepen our understanding of human wayfinding. We collected eye-tracking and synchronized video data, think-aloud protocols, and log data of users interacting with the apps, and we applied state-of-the-art deep learning models for gaze tracking and automatic classification of landmarks. Our results indicate that even the most recent version of the YOLO object detector by Redmon and Farhadi (Yolov3: an incremental improvement, arXiv, 2018) needs fine-tuning to recognize everyday objects in indoor environments. Furthermore, we provide empirical evidence that appropriate machine learning models can bridge between behavioural data recorded from users during wayfinding and conceptual models for the salience of objects and landmarks. However, simplistic models are insufficient to reasonably explain wayfinding behaviour in real time; this remains an open issue in GeoAI. We conclude that the GeoAI community should collect more naturalistic log data of wayfinding activities in order to build efficient machine learning models capable of predicting user reactions to routing instructions and of explaining how humans integrate stimuli from the environment as essential information into routing instructions while solving wayfinding tasks. Such models would form the basis for real-time wayfinding assistance.
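To make the routing setting concrete, the following is a minimal sketch of A* search over a multi-level indoor graph. It is an illustration only, not Ullmann's preference-adaptive variant; the graph layout, node names, and the `floor_penalty` heuristic term are hypothetical.

```python
import heapq
import math

def a_star(graph, coords, start, goal, floor_penalty=5.0):
    """Minimal A* over a multi-level indoor graph.

    graph:  dict node -> list of (neighbour, edge_cost)
    coords: dict node -> (x, y, floor)
    The heuristic combines Euclidean distance with a fixed penalty
    per floor change; it stays admissible as long as the penalty
    does not exceed the true cost of changing floors.
    """
    def h(n):
        x1, y1, f1 = coords[n]
        x2, y2, f2 = coords[goal]
        return math.hypot(x2 - x1, y2 - y1) + floor_penalty * abs(f2 - f1)

    open_set = [(h(start), start)]
    g = {start: 0.0}
    came_from = {}
    closed = set()
    while open_set:
        _, current = heapq.heappop(open_set)
        if current == goal:
            # Reconstruct the path by walking the predecessor links.
            path = [current]
            while current in came_from:
                current = came_from[current]
                path.append(current)
            return list(reversed(path))
        if current in closed:
            continue
        closed.add(current)
        for neighbour, cost in graph.get(current, ()):
            tentative = g[current] + cost
            if tentative < g.get(neighbour, math.inf):
                g[neighbour] = tentative
                came_from[neighbour] = current
                heapq.heappush(open_set, (tentative + h(neighbour), neighbour))
    return None

# Hypothetical three-node building fragment spanning two floors.
graph = {
    "entrance":   [("stairs", 12.0)],
    "stairs":     [("entrance", 12.0), ("office_201", 8.0)],
    "office_201": [("stairs", 8.0)],
}
coords = {"entrance": (0, 0, 0), "stairs": (10, 5, 0), "office_201": (14, 5, 1)}
print(a_star(graph, coords, "entrance", "office_201"))
# -> ['entrance', 'stairs', 'office_201']
```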
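The fine-tuning the abstract alludes to could look roughly as follows. The studies reported here used YOLOv3; this sketch instead uses the `ultralytics` Python package (a later YOLO lineage) purely for illustration, and the dataset file `indoor_landmarks.yaml` and image name are placeholders, not artifacts of the project.

```python
# Sketch: fine-tuning a YOLO detector on indoor landmark images.
from ultralytics import YOLO

# Start from weights pre-trained on COCO; everyday indoor objects
# such as signs, doors, or fire extinguishers are under-represented
# there, which is why fine-tuning on domain images is needed.
model = YOLO("yolov8n.pt")

# indoor_landmarks.yaml (hypothetical) points to train/val images
# and the landmark class names.
model.train(data="indoor_landmarks.yaml", epochs=50, imgsz=640)

# Run inference on a frame from the synchronized video stream.
results = model("corridor_frame.jpg")
for box in results[0].boxes:
    print(box.cls, box.conf)
```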