Proceedings of the 23rd ACM Symposium on Virtual Reality Software and Technology 2017
DOI: 10.1145/3139131.3143420
Hand-free natural user interface for VR HMD with IR based facial gesture tracking sensor

Cited by 13 publications (7 citation statements) · References 3 publications
Citation types: 0 supporting, 7 mentioning, 0 contrasting
Citing publications: 2019–2024

Citation statements, ordered by relevance:
“…By using these methods, it is possible to determine the user's eye gaze (i.e., where the user is looking at in the VE), enabling them to point or aim at virtual elements (e.g., buttons) without moving the head [8]. However, it is also possible to determine relative eye movements (e.g., the eyes turned to the right) and issue a "to right" command [108], and detect when users blink [62,113] or have their eyes closed [59] to confirm selections or control the VR systems. Additionally, using the eyes for interaction suffers from what is described as the "Midas Touch" problem [50].…”
Section: Interaction Interfaces (mentioning)
confidence: 99%
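The "Midas Touch" problem mentioned in this statement arises because every gaze sample is also a potential command, so merely looking at a button can trigger it. A common mitigation is dwell-time selection; below is a minimal sketch in Python of that idea. The class name, threshold value, and sample format are illustrative assumptions, not taken from the cited papers.

```python
import time

DWELL_THRESHOLD_S = 0.8  # illustrative dwell time; real systems tune this per task


class DwellSelector:
    """Confirms a gaze target only after it has been fixated continuously.

    Separating 'looking' from 'selecting' this way avoids the Midas Touch
    problem, where every glance would immediately issue a command.
    """

    def __init__(self, threshold_s: float = DWELL_THRESHOLD_S):
        self.threshold_s = threshold_s
        self._target = None   # target currently under the gaze
        self._since = 0.0     # when the current fixation started

    def update(self, gazed_target, now: float = None):
        """Feed one gaze sample; returns the target once dwell completes."""
        now = time.monotonic() if now is None else now
        if gazed_target != self._target:
            # Gaze moved to a new target (or away): restart the dwell timer.
            self._target, self._since = gazed_target, now
            return None
        if gazed_target is not None and now - self._since >= self.threshold_s:
            # Dwell completed: emit the selection and reset so it fires once.
            self._target, self._since = None, now
            return gazed_target
        return None
```

A blink- or eye-closure-based confirmation, as in the statement above, would replace the dwell timer with an explicit "commit" event, trading selection speed for robustness against unintended triggers.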
“…Nickel et al found that “Langer ’s line” [32], which indicates the collagen fiber distribution in human skin, follows the direction of measured SRDR of skin [4], and Cha et al showed that the SRDR measured in relation to skin deformation mirrors deformation of the skin [5]. Cha et al and Kim et al applied IR diffusion reflectance to detect VR headset user’s facial expressions [33,34].…”
Section: Sensor Implementation (mentioning)
confidence: 99%
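The approach this statement describes infers skin deformation, and hence facial gestures, from changes in the IR diffuse reflectance measured at the skin inside the HMD. A minimal sketch of that idea follows, using nearest-template matching over a hypothetical array of IR photodiode readings; the channel count, gesture set, and template values are assumptions for illustration and do not come from the cited papers.

```python
import numpy as np

# Hypothetical calibration data: mean IR reflectance vector recorded per
# gesture. The sensors are assumed to face the skin from inside the HMD;
# the four channels and three gestures here are illustrative only.
GESTURE_TEMPLATES = {
    "neutral": np.array([0.52, 0.48, 0.50, 0.51]),
    "smile":   np.array([0.61, 0.70, 0.58, 0.55]),
    "frown":   np.array([0.43, 0.40, 0.47, 0.49]),
}


def classify_gesture(ir_reading: np.ndarray) -> str:
    """Nearest-template classification of one IR reflectance sample.

    Skin deformation changes the diffuse reflectance each photodiode sees,
    so each facial gesture maps to a characteristic reading vector.
    """
    distances = {
        name: float(np.linalg.norm(ir_reading - template))
        for name, template in GESTURE_TEMPLATES.items()
    }
    return min(distances, key=distances.get)


# Example: a sample close to the "smile" template.
print(classify_gesture(np.array([0.60, 0.68, 0.57, 0.56])))  # -> "smile"
```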
“…Especially, because AR/VR displays are placed near to eyes, it is impossible to touch the screen directly. Therefore, other input tools using various sensors such as leap motion sensors, electromyograph sensors, inertial measurement units, eye-trackers, IR facial gesture sensors, cameras, and axis-tilt sensors, have been employed [126][127][128][129][130][131][132][133][134].…”
Section: Introduction (mentioning)
confidence: 99%