Head movement is widely used as a uniform type of input for human-computer interaction. However, there are fundamental differences between head movements coupled with gaze in support of our visual system, and head movements performed as gestural expression. Both Head-Gaze and Head Gestures are of utility for interaction but differ in their affordances. To facilitate the treatment of Head-Gaze and Head Gestures as separate types of input, we developed HeadBoost as a novel classifier, achieving high accuracy in classifying gaze-driven versus gestural head movement (F1-Score: 0.89). We demonstrate the utility of the classifier with three applications: gestural input while avoiding unintentional input by Head-Gaze; target selection with Head-Gaze while avoiding Midas Touch by head gestures; and switching of cursor control between Head-Gaze for fast positioning and Head Gestures for refinement. The classification of Head-Gaze and Head Gestures allows for seamless head-based interaction while avoiding false activation. CCS Concepts: • Human-centered computing → Gestural input; Virtual reality.
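The abstract does not detail how HeadBoost separates the two movement types, so the following is only a rough sketch of one plausible pipeline: kinematic features over short windows of head and eye traces fed to a gradient-boosted classifier. The feature set, window length, synthetic data, and use of scikit-learn are assumptions for illustration, not the authors' method.

```python
# Hypothetical sketch of a gaze-vs-gesture head-movement classifier.
# Not the authors' HeadBoost implementation: feature choices, window
# length, and the scikit-learn gradient-boosting model are assumptions.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import f1_score

def window_features(head_yaw, head_pitch, gaze_yaw, gaze_pitch):
    """Summary statistics over one fixed-length window of angular samples."""
    def stats(x):
        v = np.diff(x)                      # per-sample angular velocity
        return [x.std(), np.abs(v).mean(), np.abs(v).max()]
    # Gaze-supporting head movement is typically accompanied by compensatory
    # eye rotation (VOR), so head/eye velocity correlation is a plausible cue.
    corr = np.corrcoef(np.diff(head_yaw), np.diff(gaze_yaw))[0, 1]
    return (stats(head_yaw) + stats(head_pitch)
            + stats(gaze_yaw) + stats(gaze_pitch) + [corr])

# Synthetic stand-in data: 400 windows of 60 samples each (e.g. 1 s at 60 Hz).
rng = np.random.default_rng(0)
X, y = [], []
for label in (0, 1):                        # 0 = Head-Gaze, 1 = Head Gesture
    for _ in range(200):
        t = np.linspace(0, 1, 60)
        amp = 5 + 10 * rng.random()
        head_yaw = amp * np.sin(2 * np.pi * (1 + label) * t) + rng.normal(0, 0.5, 60)
        head_pitch = rng.normal(0, 0.5, 60)
        # For gaze-driven movement the eye counter-rotates against the head.
        gaze_yaw = -head_yaw if label == 0 else rng.normal(0, 0.5, 60)
        gaze_pitch = rng.normal(0, 0.5, 60)
        X.append(window_features(head_yaw, head_pitch, gaze_yaw, gaze_pitch))
        y.append(label)

X_train, X_test, y_train, y_test = train_test_split(np.array(X), np.array(y), random_state=0)
clf = GradientBoostingClassifier().fit(X_train, y_train)
print("F1:", f1_score(y_test, clf.predict(X_test)))
```

The head/eye correlation feature encodes the intuition stated in the abstract: head movement "coupled with gaze in support of our visual system" behaves differently from head movement "performed as gestural expression".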
A hybrid gaze and brain-computer interface (BCI) was developed to accomplish target selection in a Fitts' law experiment. The method, GIMIS, uses gaze input to steer the computer cursor for target pointing and motor imagery (MI) via the BCI to execute a click for target selection. An experiment (n = 15) compared three motor imagery selection methods: using the left hand only, using the legs, and using either the left hand or the legs. The latter selection method ("either") had the highest throughput (0.59 bps), the fastest selection time (2650 ms), and an error rate of 14.6%. Pupil size significantly increased with increased target width. We recommend the use of large targets, which significantly reduced error rate, and the "either" option for BCI selection, which significantly increased throughput. BCI selection is slower than dwell-time selection, but if gaze control is deteriorating, for example in a late stage of ALS, GIMIS may be a way to gradually introduce BCI. CCS Concepts: • Human-centered computing → HCI design and evaluation methods.
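For readers unfamiliar with how a throughput figure such as 0.59 bps is derived in Fitts' law studies, the snippet below shows the common ISO 9241-9 style calculation (effective index of difficulty divided by movement time). The distance and width values are invented for illustration and are not taken from the study.

```python
# Hedged illustration of Fitts' law throughput (ISO 9241-9, effective-width
# variant). The amplitude and width below are made-up example values; only
# the ~2.65 s movement time echoes the mean selection time reported above.
import math

def throughput_bps(distance, effective_width, movement_time_s):
    """Effective index of difficulty (bits) divided by movement time (s)."""
    id_e = math.log2(distance / effective_width + 1)   # Shannon formulation
    return id_e / movement_time_s

# Example: a 300 px movement to a target of 150 px effective width,
# selected in 2.65 s, yields roughly 0.6 bps.
print(round(throughput_bps(300, 150, 2.65), 2))
```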
We present a working paper on integrating eye tracking with mixed and augmented reality for the benefit of low vision aids. We outline the current state of the art and relevant research and point to further research and development required to adapt to the individual user, environment, and current task. We outline key technical challenges and possible solutions, including calibration, dealing with varying eye-data quality, and measuring and adapting image processing to low vision within current technical limitations, and we outline an experimental approach to designing data-driven solutions using machine learning and artificial intelligence. CCS Concepts: • Human-centered computing → Accessibility technologies; Mixed / augmented reality.