The authors present an approach to the coordination of eye movements and locomotion in naturalistic steering tasks, grounded in recent empirical research on driver eye movements that poses challenges for existing accounts of how we visually steer a course. They first analyze how feedback and feedforward processes and internal models are treated in control-theoretic steering models within vision science and engineering, two traditions that share an underlying architecture but have historically developed separately. They then show how these traditions can be naturally (re)integrated with each other and with contemporary neuroscience to better understand the skill and gaze strategies involved. On this basis they propose a conceptual model that (a) gives a unified account of the coordination of gaze and steering control, (b) incorporates higher-level path planning, and (c) draws on the literature on paired forward and inverse models in predictive control. Although each of (a)-(c) has been considered before, including in the context of driving, their integration into a single framework, and the authors' multiple-waypoint identification hypothesis within that framework, are novel. The proposed hypothesis is relevant to all forms of visually guided locomotion.
Objective: To present a structured, narrative review highlighting research into human perceptual-motor coordination that can be applied to automated vehicle (AV)–human transitions. Background: Manual control of vehicles is made possible by the coordination of perceptual-motor behaviors (gaze and steering actions), where active feedback loops enable drivers to respond rapidly to ever-changing environments. AVs will change the nature of driving to periods of monitoring followed by the human driver taking over manual control. The impact of this change is currently poorly understood. Method: We outline an explanatory framework for understanding control transitions based on models of human steering control. This framework can be summarized as a perceptual-motor loop that requires (a) calibration and (b) gaze and steering coordination. A review of the current experimental literature on transitions is presented in the light of this framework. Results: The success of transitions is often measured using reaction times; however, the perceptual-motor mechanisms underpinning steering quality remain relatively unexplored. Conclusion: Modeling the coordination of gaze and steering and the calibration of perceptual-motor control will be crucial to ensure safe and successful transitions out of automated driving. Application: This conclusion poses a challenge for future research on AV–human transitions. Future studies need to provide an understanding of human behavior that will be sufficient to capture the essential characteristics of drivers reengaging control of their vehicle. The proposed framework can provide a guide for investigating specific components of human control of steering and potential routes to improving manual control recovery.
A major unresolved question in understanding visually guided locomotion in humans is whether actions are driven solely by the immediately available optical information (model-free online control mechanisms), or whether internal models have a role in anticipating the future path. We designed two experiments to investigate this issue, measuring spontaneous gaze behaviour while steering, and predictive gaze behaviour when future path information was withheld. In Experiment 1, participants (N = 15) steered along a winding path with rich optic flow: gaze patterns were consistent with tracking waypoints on the future path 1–3 s ahead. In Experiment 2, participants (N = 12) followed a path presented only as visual waypoints located on an otherwise featureless ground plane. New waypoints appeared periodically every 0.75 s and predictably 2 s ahead, except that in 25% of cases the waypoint at the expected location was not displayed. In these cases, there were always other visible waypoints for the participant to fixate, yet participants continued to make saccades to the empty but predictable waypoint locations (consistent with internal models of the future path guiding gaze fixations). This would not be expected under existing model-free online steering control models, and strongly points to a need for models of steering control to include mechanisms for predictive gaze control that support anticipatory path-following behaviours.
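The model-free alternative discussed above can be illustrated with a minimal sketch: a controller whose steering command depends only on currently visible waypoints. The point-mass geometry, proportional gain, and nearest-waypoint selection rule below are all illustrative assumptions, not the models tested in the experiments; the sketch simply makes concrete why such a controller produces no response when the expected waypoint is withheld.

```python
import math

def steer_model_free(pos, heading, waypoints, gain=1.0):
    """Model-free online control: the steering command depends only on
    waypoints that are currently visible. With nothing in view there is
    no internal model to extrapolate the future path, so the command is
    zero -- unlike the participants, who still saccaded to the empty,
    predictable waypoint locations."""
    if not waypoints:
        return 0.0  # no visible waypoint -> no basis for a response
    # Track the nearest visible waypoint (a stand-in for gaze ~1-3 s ahead).
    wx, wy = min(waypoints,
                 key=lambda w: math.hypot(w[0] - pos[0], w[1] - pos[1]))
    bearing = math.atan2(wy - pos[1], wx - pos[0]) - heading
    # Wrap to (-pi, pi] and issue a proportional steering-rate command.
    bearing = (bearing + math.pi) % (2 * math.pi) - math.pi
    return gain * bearing
```

For example, a waypoint at 45° to the left of the heading yields a positive steering-rate command proportional to that bearing, while an empty waypoint list yields exactly zero.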
How do animals follow demarcated paths? Different species are sensitive to optic flow, and one control solution is to maintain the balance of flow symmetry across the visual fields; however, it is unclear whether animals are sensitive to changes in asymmetries when steering along curved paths. Flow asymmetries can alter the global properties of flow (i.e. flow speed), which may also influence steering control. We tested humans steering along curved paths in a virtual environment. The scene was manipulated so that the ground plane to either side of the demarcated path produced larger or smaller asymmetries in optic flow. Independent of asymmetries and locomotor speed, the scene properties were altered to produce either faster or slower globally averaged flow speeds. Results showed that rather than being influenced by changes in flow asymmetry, steering responded to global flow speed. We conclude that the human brain performs global averaging of flow speed across the scene and uses this signal as an input for steering control. This finding is surprising since the demarcated path provided sufficient information to steer, whereas global flow speed (by itself) did not. To explain these findings, existing models of steering must be modified to include a new perceptual variable: global optic flow speed.
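The proposed perceptual variable, globally averaged flow speed, can be sketched computationally: pool the magnitudes of the 2-D flow vectors across the whole scene into a single scalar. The array layout and the idea of feeding the scalar into a steering gain are assumptions for illustration; the abstract does not specify how the signal enters the steering model, nor the sign or form of the coupling.

```python
import numpy as np

def global_flow_speed(flow):
    """Globally averaged optic flow speed: the mean magnitude of the
    2-D flow vectors pooled across the whole visual scene.
    `flow` is an (H, W, 2) array of per-location (u, v) flow vectors."""
    return float(np.mean(np.hypot(flow[..., 0], flow[..., 1])))

def steering_gain(flow, k=0.5, baseline=1.0):
    """Hypothetical coupling: scale a baseline steering gain by the
    pooled flow speed. The functional form and parameters are
    placeholders, not a fitted model."""
    return baseline / (1.0 + k * global_flow_speed(flow))
```

A uniform flow field of (3, 4) vectors, for instance, pools to a global speed of 5.0 regardless of how asymmetrically the scene regions would weight a left/right flow-balance computation, which is the distinction the experiment exploits.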
When negotiating bends, car drivers perform gaze polling: their gaze shifts between guiding fixations (GFs; gaze directed 1–2 s ahead) and look-ahead fixations (LAFs; longer time headway). How might this behavior change in autonomous vehicles, where the need for constant active visual guidance is removed? In this driving simulator study, we analyzed this gaze behavior both when the driver was in charge of steering and when steering was delegated to automation, separately for bend approach (straight line) and bend entry (turn), and at various speeds. The analysis of gaze distributions relative to bend sections and driving conditions indicates that visual anticipation (through LAFs) is most prominent before entering the bend. Passive driving increased the proportion of LAFs, with a concomitant decrease in GFs, and increased the gaze polling frequency. Gaze polling frequency also increased at higher speeds, in particular during the bend approach when steering was not performed. LAFs encompassed a wide range of eccentricities. To account for this heterogeneity, two sub-categories serving distinct information requirements are proposed: mid-eccentricity LAFs could be more useful for anticipatory planning of steering actions, and far-eccentricity LAFs for monitoring potential hazards. The results support the idea that gaze and steering coordination may be strongly impacted in autonomous vehicles.
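The GF/LAF distinction above rests on time headway: how far ahead along the path a fixation lands, divided by current speed. A minimal sketch of that classification follows; treating the 2 s boundary from the GF definition as a hard threshold, and taking distance along the path as given, are simplifications for illustration.

```python
def classify_fixation(distance_ahead_m, speed_mps, gf_max_headway_s=2.0):
    """Label a road-directed fixation as a guiding fixation (GF) or a
    look-ahead fixation (LAF) from its time headway (distance along the
    future path divided by current speed). GFs fall within ~1-2 s;
    LAFs lie beyond that horizon."""
    headway_s = distance_ahead_m / speed_mps
    return "GF" if headway_s <= gf_max_headway_s else "LAF"
```

For example, at 20 m/s a fixation 20 m ahead (1 s headway) classifies as a GF, while one 90 m ahead (4.5 s) classifies as an LAF.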