Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems
DOI: 10.1145/3411764.3445697

Radi-Eye: Hands-Free Radial Interfaces for 3D Interaction using Gaze-Activated Head-Crossing

Abstract: Figure 1: Radi-Eye in a smart home environment for control of appliances. A: The user turns on the lamp via a toggle selection with minimal effort using only gaze (orange) and head (red) movements. B: Selection can be expanded to subsequent head-controlled continuous interaction to adjust the light colour via a slider. C: Gaze-triggered nested levels support a large number of widgets and easy selection of one of the multiple preset lighting modes. The widgets enabled via Radi-Eye allow a high level of hands-fre…
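The interaction the abstract describes, gaze to pre-select a widget, then a head movement crossing the widget boundary to confirm, can be illustrated with a minimal geometric sketch. This is not the paper's implementation: the `Widget` class, circular regions (the paper uses radial wedges), and both function names are hypothetical simplifications for illustration only.

```python
import math
from dataclasses import dataclass

@dataclass
class Widget:
    # Hypothetical simplification: one menu item modelled as a circle;
    # Radi-Eye's actual widgets are sectors of a radial interface.
    cx: float
    cy: float
    radius: float

def gaze_preselects(widget: Widget, gx: float, gy: float) -> bool:
    """Gaze landing inside the widget pre-selects it (highlight only,
    no activation -- this avoids the Midas-touch problem of gaze alone)."""
    return math.hypot(gx - widget.cx, gy - widget.cy) <= widget.radius

def head_crossing_confirms(widget: Widget, head_path: list) -> bool:
    """Confirm the selection only when the head pointer crosses the
    widget boundary from outside to inside along its sampled path."""
    for (x0, y0), (x1, y1) in zip(head_path, head_path[1:]):
        was_outside = math.hypot(x0 - widget.cx, y0 - widget.cy) > widget.radius
        now_inside = math.hypot(x1 - widget.cx, y1 - widget.cy) <= widget.radius
        if was_outside and now_inside:
            return True
    return False

# Example: gaze rests on the widget, then the head pointer crosses in.
lamp_toggle = Widget(cx=0.0, cy=0.0, radius=1.0)
preselected = gaze_preselects(lamp_toggle, 0.2, 0.1)        # True
confirmed = head_crossing_confirms(lamp_toggle, [(2.0, 0.0), (0.5, 0.0)])  # True
activate = preselected and confirmed
```

The design point the sketch captures is the division of labour between the modalities: gaze alone never activates anything, and a head crossing only counts when it lands on a gaze-pre-selected widget.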

Cited by 28 publications (13 citation statements)
References 49 publications
“…To recognize head gestures or orientations, researchers have explored various sensing techniques such as motion sensing [14], acoustic sensing (Soundr [53]), capacitive sensing [16], vision-based sensing [23,35], and so on. Radi-Eye [40] is a hands-free radial interface for interaction in 3D space using both gaze and head-crossing gestures. HeadCross [50] and HeadGesture [52] presented hands-free interfaces using head movements for HMDs.…”
Section: Hands-free Gesture Input
confidence: 99%
“…We found the idea transferable to eye and hand interfaces, as we are not normally looking at our hands during visual exploration, but naturally do so when we manipulate a target [43]. In Look&Cross, eye and head are aligned for selection in a radial menu [45]. As in Look&Cross, we adopt gaze for pre-selection in menus, but we use manual target crossing for confirmation instead of crossing by head pointer.…”
Section: Alignment Of Input From Separate Pointing Modalities
confidence: 99%
“…Head-gaze interaction is one of the most common approaches, which has been widely studied for target selection (e.g., [2,11,17,21]) and text entry (e.g., [35,37,41]) in VR/AR HMDs. As head movements can be performed accurately yet effectively, head-gaze interaction has been used as a standalone approach and combined with or supported by other modalities, such as eye-based interaction [2,11,15,21,26,27,35]. Head-gaze has also been used, for example, to interact in games [1], map interfaces [8], and to construct new interaction methods [6,26,27,38].…”
Section: Head-based Interaction
confidence: 99%