Abstract. We present a user study assessing spatial transfer in a 3D navigation task involving two levels of motor activity: minimal (Joystick) and extensive (a walking interface, the Treadmill), with viewpoint rotations either controlled by the user or automatically managed by the system. The task consisted of learning a path in a 3D model of a real city under one of four conditions: Joystick or Treadmill, combined with Manual or Automatic rotation. We assessed spatial knowledge with six spatial restitution tasks. To assess the interfaces, we also analyzed the interaction data acquired during path learning. Our results show that direct control of rotations has different effects depending on the motor activity required by the input modality: the quality of spatial representation increases with the Treadmill when rotations are enabled, whereas with the Joystick, controlling the rotations affects spatial representations. We discuss our findings in terms of cognitive and sensorimotor processes and human-computer interaction issues.