A neural model is developed to explain how humans can approach a goal object on foot while steering around obstacles to avoid collisions in a cluttered environment. The model uses optic flow from a 3D virtual reality environment to determine the position of objects based on motion discontinuities, and computes heading, or the direction of self-motion, from global optic flow. The cortical representation of heading interacts with the representations of a goal and obstacles such that the goal acts as an attractor of heading, while obstacles act as repellers. In addition, the model maintains fixation on the goal object by generating smooth pursuit eye movements. Eye rotations can distort the optic flow field, complicating heading perception, and the model uses extraretinal signals to correct for this distortion and accurately represent heading. The model explains how motion processing mechanisms in cortical areas MT, MST, and VIP can be used to guide steering. The model quantitatively simulates human psychophysical data on visually guided steering, obstacle avoidance, and route selection.

Key Words: Heading Perception, Steering, Optic Flow, Obstacle, Goal, Pursuit Eye Movement, Gain Fields, Peak Shift, V2, MT, MST, VIP, LIP
Introduction

Many important steering tasks are guided by visual information, including walking through a cluttered environment (Fajen & Warren, 2003), driving (Land & Horwood, 1995; Hildreth, Beusmans, Boer, & Royden, 2000; Wallis, Chatziastros, & Bülthoff, 2002), vehicle braking (Lee, 1976), piloting an aircraft (Gibson, Olum, & Rosenblatt, 1955; Beall & Loomis, 1997), and intercepting a moving target on foot (Fajen & Warren, 2004). Steering through a cluttered environment involves the selection of a path that avoids obstacles while simultaneously approaching the intended goal. Human steering is guided by visual information about the spatial layout of the environment and the direction of self-motion through it, as well as by proprioceptive feedback and information from other sensory systems. In particular, the visual system provides information about the relative positions of the goal object and obstacles in the environment.

Movement through the world creates a full-field pattern of motion on the retina, called optic flow, which contains information about the direction of self-motion, or heading (Gibson, 1950). In principle, optic flow can be used to compute heading (Longuet-Higgins & Prazdny, 1980). During translational movement of the eye, the optic flow field contains a singularity, called the focus of expansion, which specifies the direction of heading in the absence of an eye rotation.

Figures 1 & 2

Whereas optic flow relates observer motion to the available visual stimuli, recent research clarifies how visual information governs the dynamics of human navigational behavior. Fajen and Warren (2003) studied the dynamics of human steering behavior in simple goal approach and obstacle avoidance tasks using an immersive virtual reality system. They found that human performa...
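The relation between the focus of expansion and heading can be illustrated with standard projective geometry. The sketch below assumes a pinhole camera with focal length f and a frontoparallel plane of points at a single depth Z; these assumptions, and all variable names, are illustrative and not part of the model described above. For pure translation T = (Tx, Ty, Tz), each image point (x, y) moves with velocity u = (x·Tz − f·Tx)/Z, v = (y·Tz − f·Ty)/Z, which vanishes at the focus of expansion (f·Tx/Tz, f·Ty/Tz).

```python
import numpy as np

f = 1.0                        # focal length (arbitrary units)
T = np.array([0.1, 0.0, 1.0])  # translation slightly rightward of the optical axis
foe = f * T[:2] / T[2]         # focus of expansion in image coordinates

# Sample the flow field on an image grid; all points lie at depth Z = 5
xs, ys = np.meshgrid(np.linspace(-1, 1, 5), np.linspace(-1, 1, 5))
Z = 5.0
u = (xs * T[2] - f * T[0]) / Z
v = (ys * T[2] - f * T[1]) / Z

# Each flow vector points radially away from the FOE:
# (u, v) = (Tz / Z) * ((x, y) - foe), so the FOE alone specifies heading
flow = np.stack([u, v], axis=-1)
radial = np.stack([xs - foe[0], ys - foe[1]], axis=-1)
```

Because every vector is a positive multiple of its displacement from the FOE, locating the singularity of the expansion pattern recovers the heading direction, exactly as the text states for translational self-motion without eye rotation.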
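The goal-as-attractor, obstacle-as-repeller idea can be made concrete as a second-order dynamical system for heading, following the general form of Fajen and Warren's (2003) behavioral dynamics model. The sketch below is illustrative: the parameter values and the simple Euler integration are assumptions for demonstration, not fitted values or methods from the work discussed here.

```python
import math

def steering_accel(phi, phi_dot, goal, obstacles,
                   b=3.25, k_g=7.5, c1=0.40, c2=0.40,
                   k_o=198.0, c3=6.5, c4=0.8):
    """Angular acceleration of heading phi (radians).

    goal: (psi_g, d_g) direction and distance of the goal;
    obstacles: list of (psi_o, d_o). Parameters are illustrative.
    """
    psi_g, d_g = goal
    # damping plus goal attractor; attraction strengthens as the goal nears
    acc = -b * phi_dot - k_g * (phi - psi_g) * (math.exp(-c1 * d_g) + c2)
    # each obstacle repels heading, decaying with angular and radial distance
    for psi_o, d_o in obstacles:
        acc += (k_o * (phi - psi_o)
                * math.exp(-c3 * abs(phi - psi_o))
                * math.exp(-c4 * d_o))
    return acc

# Euler integration: goal straight ahead at 8 m, obstacle 5 deg left at 3 m
phi, phi_dot, dt = 0.0, 0.0, 0.01
goal = (0.0, 8.0)
obstacles = [(-math.radians(5.0), 3.0)]
for _ in range(300):
    phi_dot += steering_accel(phi, phi_dot, goal, obstacles) * dt
    phi += phi_dot * dt
```

With these settings the heading first swings rightward, away from the obstacle, and is then drawn back toward the goal direction as the repulsion decays, reproducing the attractor-repeller interplay described above in qualitative form.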