Gaze is naturally used for visual perception of our environment, and gaze movements are mainly controlled subconsciously. Forcing the user to consciously diverge from this natural gaze behavior for interaction purposes causes high cognitive workload and destroys the information contained in natural gaze movements. Instead of proposing a new gaze-based interaction technique, we analyze natural gaze behavior during an object manipulation task and show how it can be used for intention recognition, which provides a universal basis for integrating gaze into multimodal interfaces for different applications. We propose a model for the multimodal integration of natural gaze behavior and evaluate it for two use cases: improving the robustness of other, potentially noisy input cues, and designing proactive interaction techniques.
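To make the robustness use case concrete, the following sketch shows one plausible way gaze evidence could be fused with a noisy second modality to infer the intended object. The normalized fixation scores, the naive product-style fusion, and all names are assumptions for illustration only, not the model proposed in the abstract.

```python
# Hedged illustration: fuse per-object evidence from natural gaze
# (e.g. fixation durations) with a noisy cue (e.g. speech or pointing)
# into a posterior over the objects the user might intend to manipulate.

def fuse_intention(gaze_scores, noisy_cue_scores):
    """Combine per-object scores from gaze and a noisy modality.

    Both inputs map object names to non-negative evidence values;
    the result is a normalized distribution over the union of objects.
    """
    objects = set(gaze_scores) | set(noisy_cue_scores)
    fused = {obj: gaze_scores.get(obj, 1e-6) * noisy_cue_scores.get(obj, 1e-6)
             for obj in objects}
    total = sum(fused.values())
    return {obj: score / total for obj, score in fused.items()}

# Gaze dwell strongly favours the cup; the noisy cue alone is ambiguous.
gaze = {"cup": 0.7, "plate": 0.2, "bowl": 0.1}      # from fixation durations
noisy = {"cup": 0.4, "plate": 0.35, "bowl": 0.25}   # e.g. speech recognizer
print(fuse_intention(gaze, noisy))  # "cup" dominates after fusion
```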
In this contribution, we propose the use of eye-tracking technology to support video analysts. To reduce workload, we implemented two new interaction techniques as substitutes for mouse pointing: gaze-based selection of a video of interest from a set of video streams, and gaze-based selection of moving targets within a video. First results show that the multimodal gaze + key press interaction technique allows more effective selection of fast-moving objects. Moreover, we discuss further application possibilities, such as analyzing gaze behavior to measure the analyst's fatigue or analyzing the gaze behavior of expert analysts to instruct novices.
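A minimal sketch of how a gaze + key press selection of moving targets might work: on key press, the tracked object nearest the current gaze point is selected. The class, function names, and selection radius below are hypothetical and chosen for illustration; the abstract does not specify these details.

```python
# Hypothetical gaze + key press target selection: when the analyst
# presses a key, pick the tracked moving object closest to the gaze
# point, provided it lies within an assumed selection radius.
import math
from dataclasses import dataclass

@dataclass
class TrackedObject:
    object_id: str
    x: float  # current on-screen position in pixels
    y: float

def select_on_key_press(gaze_x, gaze_y, objects, max_dist_px=80.0):
    """Return the tracked object nearest the gaze point, or None if
    no object lies within the assumed selection radius."""
    best, best_dist = None, max_dist_px
    for obj in objects:
        dist = math.hypot(obj.x - gaze_x, obj.y - gaze_y)
        if dist < best_dist:
            best, best_dist = obj, dist
    return best

# Example: two moving targets, gaze resting near "car_2" at key press time.
targets = [TrackedObject("car_1", 120, 340), TrackedObject("car_2", 610, 205)]
print(select_on_key_press(600, 210, targets))  # -> car_2
```

Anchoring selection to the gaze point at key press time sidesteps the need to chase a fast-moving object with the mouse cursor, which is the workload reduction the abstract reports.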