We present PACE, a Personalized, Automatically Calibrating Eye-tracking system that unobtrusively identifies and collects training data from user interaction events on standard computing systems, without the need for specialized equipment. PACE relies on eye/facial analysis of webcam data, based on a set of robust geometric gaze features and a two-layer data validation mechanism that identifies good training samples in daily interaction data. The design of the system is founded on an in-depth investigation of the relationship between gaze patterns and interaction cues, and takes user preferences and habits into consideration. The result is an adaptive, data-driven approach that continuously recalibrates, adapts, and improves with additional use. Quantitative evaluation on 31 subjects across different interaction behaviors shows that training instances identified by the PACE data-collection mechanism have higher gaze-point/interaction-cue consistency than those identified by conventional approaches. An in-situ study using real-life tasks on a diverse set of interactive applications demonstrates that PACE gaze estimation achieves an average error of 2.56°, comparable to the state of the art but without the need for explicit training or calibration. This demonstrates the effectiveness of both the gaze estimation method and the corresponding data-collection mechanism.
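As a point of reference, angular errors such as the 2.56° reported above are conventionally obtained by converting the on-screen distance between predicted and true gaze points into visual angle at the eye. The sketch below illustrates this standard computation; it is not taken from the paper, and the pixel density and viewing distance are illustrative assumptions.

```python
import numpy as np


def angular_error_deg(pred_px: np.ndarray, true_px: np.ndarray,
                      px_per_cm: float, viewing_distance_cm: float) -> np.ndarray:
    """Convert on-screen gaze error in pixels to visual angle in degrees.

    pred_px, true_px: (N, 2) predicted and ground-truth gaze points in pixels.
    px_per_cm: screen pixel density (assumed uniform in both axes).
    viewing_distance_cm: eye-to-screen distance (assumed fixed and known).
    """
    err_cm = np.linalg.norm(pred_px - true_px, axis=1) / px_per_cm
    # The error subtends an angle of atan(error / viewing distance) at the eye.
    return np.degrees(np.arctan2(err_cm, viewing_distance_cm))


# A 40-pixel error on a 40 px/cm screen viewed from 60 cm is about 0.95 degrees.
pred = np.array([[400.0, 300.0]])
true = np.array([[440.0, 300.0]])
print(angular_error_deg(pred, true, px_per_cm=40.0, viewing_distance_cm=60.0))
```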
Most eye-gaze estimation systems rely on explicit calibration, which is inconvenient for the user, limits the amount of training data that can be collected, and consequently limits performance. Since there is likely a strong correlation between gaze and interaction cues, such as cursor and caret locations, a supervised learning algorithm can learn the complex mapping between gaze features and the gaze point by training on data collected incrementally and implicitly from normal computer interactions. We develop a set of robust geometric gaze features and a corresponding data validation mechanism that identifies good training data in the noisy, interaction-informed data collected in real-use scenarios. Based on a study of gaze movement patterns, behavior-informed validation extracts gaze features that correspond to the interaction cue, and data-driven validation provides a further level of cross-checking against previously accepted data. Experimental evaluation shows that the proposed method achieves an average error of 4.06°, demonstrating the effectiveness of the proposed gaze estimation method and the corresponding validation mechanism.
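To make the two-layer validation idea concrete, the following sketch shows one possible shape of such a validate-and-retrain loop over interaction events. The paper's actual features, learner, and thresholds are not reproduced here: the k-NN regressor, the dispersion-based stability test, and the stability_threshold and tol_px values are all invented placeholders for illustration.

```python
from dataclasses import dataclass

import numpy as np
from sklearn.neighbors import KNeighborsRegressor


@dataclass
class Sample:
    features: np.ndarray  # geometric gaze feature vector for one interaction event
    cue: np.ndarray       # interaction cue position (e.g., click location), used as the label


def behavior_valid(sample: Sample, stability_threshold: float) -> bool:
    """Layer 1 (behavior-informed): accept only samples where the gaze was
    plausibly on the interaction cue. Here a simple dispersion test on the
    feature vector stands in for a real fixation/stability check."""
    return float(np.var(sample.features)) < stability_threshold


def data_valid(sample: Sample, model, n_good: int, tol_px: float) -> bool:
    """Layer 2 (data-driven): cross-check the candidate against previously
    accepted data by comparing the current model's prediction with the cue."""
    if n_good < 10:  # not enough accepted history yet; let the sample through
        return True
    pred = model.predict(sample.features[None, :])[0]
    return float(np.linalg.norm(pred - sample.cue)) < tol_px


def update(model, good: list[Sample], sample: Sample,
           stability_threshold: float = 0.05, tol_px: float = 150.0):
    """Grow the training set from one interaction event and retrain."""
    if behavior_valid(sample, stability_threshold) and \
       data_valid(sample, model, len(good), tol_px):
        good.append(sample)
        X = np.stack([s.features for s in good])
        y = np.stack([s.cue for s in good])
        model.fit(X, y)  # retrain on the enlarged, validated set
    return model


model = KNeighborsRegressor(n_neighbors=3)
good: list[Sample] = []
```

The key design choice this sketch captures is that labels come for free from interaction events, so the learner can keep improving with use; the two validation layers exist only to keep label noise (glances away from the cursor, mis-clicks) out of the training set.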
We present a vision-based, data-driven approach to identifying and measuring refractive errors in human subjects with low-cost, easily available equipment and no specialist training. Refractive errors (e.g., nearsightedness and astigmatism) are common ocular problems which, if uncorrected, may lead to serious visual impairment. Diagnosing such defects conventionally requires expensive specialist equipment and trained personnel, which is a barrier in many parts of the developing world. Our approach aims to democratize optometric care by exploiting the computational power of consumer-grade devices and the advances made possible by multimedia computing. We present results showing that our system matches, and under certain conditions outperforms, state-of-the-art medical devices.
Affect exchange is essential for healthy physical and social development [7], and friends and family communicate their emotions to each other instinctively. In particular, watching movies has always been a popular mode of socialization, and video sharing is increasingly viewed as an effective way to communicate feelings and affect, even when the parties are not in the same location. We present an asynchronous video-sharing platform that uses Emotars to facilitate affect sharing, creating and enhancing a sense of togetherness through the experience of asynchronous movie watching. We investigate its potential impact and benefits, including a better viewing experience, support for relationships, and stronger engagement, connectedness, and emotion awareness among individuals.