Beyond its growing use in behavioral research, eye tracking in VR enables a variety of other use cases (Duchowski, 2002, 2017; Plopski et al., 2022). To highlight just a few examples: foveated rendering allows higher visual fidelity while reducing rendering demands and power consumption for VR graphics (Albert et al., 2017; Patney et al., 2016); gaze-based pointing and target selection can provide intuitive, multimodal methods of interaction (Jacob & Stellmach, 2016; Majaranta & Bulling, 2014; Plopski et al., 2022; Tanriverdi & Jacob, 2000); and knowledge of a user's current gaze direction can enable novel ways to experience and imperceptibly manipulate a virtual environment (e.g., Langbehn et al., 2018; Marwecki et al., 2019). These applications place different demands on the quality of eye tracking data: gaze selection and interaction require high spatial accuracy and precision of the estimated gaze position (Feit et al., 2017; Orquin & Holmqvist, 2018; Schuetz et al., 2019, 2020), whereas foveated rendering requires very low latency between an eye movement and the corresponding change in the visual scene (Albert et al., 2017; Stein et al., 2021).