We present a novel approach for accurate eye tracking as required, e.g., in VR/AR/MR headsets. Our method exploits surface normals and dense 3D features retrieved from deflectometry measurements to estimate the gaze direction.
Eye tracking is an important tool with a wide range of applications in Virtual, Augmented, and Mixed Reality (VR/AR/MR) technologies. State-of-the-art eye tracking methods are either reflection-based, tracking reflections of sparse point light sources, or image-based, exploiting 2D features of the acquired eye image. In this work, we attempt to significantly improve reflection-based methods by utilizing pixel-dense deflectometric surface measurements in combination with optimization-based inverse rendering algorithms. Utilizing the known geometry of our deflectometric setup, we develop a differentiable rendering pipeline based on PyTorch3D that simulates a virtual eye under screen illumination. Finally, we exploit the image-screen correspondence information from the captured measurements to find the eye's rotation, translation, and shape parameters with our renderer via gradient descent. In general, our method does not require a specific pattern and can work with ordinary video frames of the main VR/AR/MR screen itself. We demonstrate real-world experiments with evaluated mean relative gaze errors below 0.45° at a precision better than 0.11°. Moreover, we show an improvement of 6× over a representative reflection-based state-of-the-art method in simulation.
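The following is a minimal sketch, not the authors' implementation, of the gradient-descent fitting loop described above: eye rotation, translation, and shape parameters are optimized so that a differentiable forward model reproduces the measured image-screen correspondences. It uses plain PyTorch, and `toy_forward_model` is a hypothetical, deliberately simplified stand-in for the paper's PyTorch3D-based renderer that ray-traces screen reflections off the posed eye surface.

```python
# Sketch only: illustrates the optimization structure, not the actual renderer.
import torch

def toy_forward_model(rotation, translation, shape, pixel_grid):
    """Hypothetical stand-in for the differentiable renderer: maps camera
    pixels to predicted 2D screen coordinates as a smooth function of the
    eye parameters. A real implementation would trace specular reflections
    off the posed eye mesh with PyTorch3D instead."""
    # In-plane rotation plus a translation offset and a shape-dependent
    # scaling -- just enough to make the loss differentiable in all parameters.
    rot = torch.stack([
        torch.stack([torch.cos(rotation[2]), -torch.sin(rotation[2])]),
        torch.stack([torch.sin(rotation[2]),  torch.cos(rotation[2])]),
    ])
    scale = 1.0 + 0.1 * shape.mean()
    return scale * pixel_grid @ rot.T + translation[:2]

def fit_eye(pixel_grid, observed_screen_coords, n_iters=500, lr=1e-2):
    rotation = torch.zeros(3, requires_grad=True)      # axis-angle eye rotation
    translation = torch.zeros(3, requires_grad=True)   # eyeball position offset
    shape = torch.zeros(4, requires_grad=True)         # low-dimensional shape parameters

    optimizer = torch.optim.Adam([rotation, translation, shape], lr=lr)
    for _ in range(n_iters):
        optimizer.zero_grad()
        predicted = toy_forward_model(rotation, translation, shape, pixel_grid)
        # Dense per-pixel loss between measured and rendered screen correspondences.
        loss = torch.nn.functional.mse_loss(predicted, observed_screen_coords)
        loss.backward()
        optimizer.step()
    return rotation.detach(), translation.detach(), shape.detach()

if __name__ == "__main__":
    # Synthetic example: recover parameters from toy "measurements".
    pixels = torch.rand(1000, 2)
    true_params = (torch.tensor([0.0, 0.0, 0.1]),
                   torch.tensor([0.02, -0.01, 0.0]),
                   torch.tensor([0.3, 0.0, 0.0, 0.0]))
    target = toy_forward_model(*true_params, pixels)
    rot, trans, shp = fit_eye(pixels, target)
    print("recovered in-plane rotation (rad):", rot[2].item())
```

Because the loss is computed over pixel-dense correspondences rather than a handful of sparse glints, every camera pixel that sees a screen reflection contributes a gradient signal to the pose and shape estimate.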