Proceedings of the 24th Australian Computer-Human Interaction Conference 2012
DOI: 10.1145/2414536.2414614

Gaze tracking and non-touch gesture based interaction method for mobile 3D virtual spaces

Cited by 19 publications (17 citation statements)
References 18 publications
“…Pouke et al. investigated the combination of gaze and mid-air gestures, but in the form of a 6DOF sensor device attached to the hand [13]. Their system supported tilt, grab/switch, shake and throw gestures.…”
Section: Gaze and Mid-air Gestures
confidence: 99%
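As a rough illustration of how tilt and shake gestures might be detected from a hand-worn 6DOF sensor, the sketch below thresholds accelerometer samples. The thresholds, window size, and function names are assumptions for illustration, not details from Pouke et al. [13].

```python
import math
from collections import deque

# Minimal sketch: classify tilt and shake gestures from accelerometer
# samples of a hand-worn 6DOF sensor. All thresholds are assumed values.

TILT_THRESHOLD_DEG = 30.0   # assumed tilt angle that triggers the gesture
SHAKE_THRESHOLD_G = 1.8     # assumed acceleration magnitude (in g) for a spike
SHAKE_WINDOW = 20           # number of recent samples inspected for a shake

def tilt_angle_deg(ax, ay, az):
    """Angle between the sensor's z-axis and the gravity vector, in degrees."""
    mag = math.sqrt(ax * ax + ay * ay + az * az)
    if mag == 0.0:
        return 0.0
    return math.degrees(math.acos(max(-1.0, min(1.0, az / mag))))

class GestureDetector:
    def __init__(self):
        self.recent = deque(maxlen=SHAKE_WINDOW)

    def feed(self, ax, ay, az):
        """Consume one sample; return 'shake', 'tilt', or None."""
        self.recent.append((ax, ay, az))
        spikes = sum(
            1 for sx, sy, sz in self.recent
            if math.sqrt(sx * sx + sy * sy + sz * sz) > SHAKE_THRESHOLD_G
        )
        if spikes >= 4:             # several spikes in a short window: shake
            self.recent.clear()
            return "shake"
        if tilt_angle_deg(ax, ay, az) > TILT_THRESHOLD_DEG:
            return "tilt"
        return None
```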
“…Hales et al. describe a system in which discrete hand gestures issued commands to objects in the environment selected by gaze [12]. Pouke et al. investigated the combination of gaze and mid-air gestures, but in the form of a 6DOF sensor device attached to the hand [13]. They compared their technique with touch, and found the touch-based interaction to be faster and more accurate.…”
Section: Introduction
confidence: 99%
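As a minimal sketch of the gaze-select / gesture-command pattern both papers describe, the controller below selects whatever object the gaze dwells on and routes discrete gestures to it. The dwell time, the object interface, and the handler naming convention are assumptions, not the cited systems' APIs.

```python
# Sketch of gaze selection plus discrete gesture commands. The dwell
# threshold and the on_<gesture> handler convention are assumed here.

DWELL_TIME_S = 0.5  # assumed fixation duration required to select

class GazeGestureController:
    def __init__(self, scene_objects):
        self.scene = scene_objects        # mapping: name -> scene object
        self.selected = None
        self.fixation_target = None
        self.fixation_start = 0.0

    def on_gaze(self, target_name, timestamp):
        """Track fixations; select an object after a sustained dwell."""
        if target_name != self.fixation_target:
            self.fixation_target = target_name
            self.fixation_start = timestamp
        elif (target_name is not None
              and timestamp - self.fixation_start >= DWELL_TIME_S):
            self.selected = self.scene.get(target_name)

    def on_gesture(self, gesture):
        """Route a discrete gesture (e.g. 'grab', 'throw') to the selection."""
        if self.selected is None:
            return
        handler = getattr(self.selected, "on_" + gesture, None)
        if callable(handler):
            handler()
```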
“…Finally, the user experiences much less fatigue moving their eyes than their hands or other parts of the body. Many authors have previously studied multimodal control interfaces combining gaze-tracking pointing with hand-gesture control [15,16,12,17]. Pouke et al. [12] used an eye-tracking interface for selecting a model and controlled it using wearable hardware on the hand. But this system provided only limited movement and rotation controls, which are insufficient for CAD applications.…”
Section: Our Approach
confidence: 99%
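A hedged sketch of the "wearable hardware controls the gaze-selected model" idea: relative changes in the hand device's orientation are applied to the model's orientation. The quaternion helpers and the update scheme are illustrative assumptions, not the cited system's implementation.

```python
import numpy as np

# Sketch: drive a selected model's rotation from a hand-worn device's
# orientation stream (unit quaternions in (w, x, y, z) order). Helper
# names and the update scheme are assumptions for illustration.

def quat_multiply(q, r):
    w0, x0, y0, z0 = q
    w1, x1, y1, z1 = r
    return np.array([
        w0*w1 - x0*x1 - y0*y1 - z0*z1,
        w0*x1 + x0*w1 + y0*z1 - z0*y1,
        w0*y1 - x0*z1 + y0*w1 + z0*x1,
        w0*z1 + x0*y1 - y0*x1 + z0*w1,
    ])

def quat_conjugate(q):
    # For unit quaternions the conjugate equals the inverse.
    return np.array([q[0], -q[1], -q[2], -q[3]])

class ModelRotator:
    """Apply the hand device's orientation change to the selected model."""
    def __init__(self, model_orientation):
        self.model_q = np.asarray(model_orientation, dtype=float)
        self.last_device_q = None

    def update(self, device_q):
        device_q = np.asarray(device_q, dtype=float)
        if self.last_device_q is not None:
            # Rotation delta since the previous sensor sample.
            delta = quat_multiply(device_q, quat_conjugate(self.last_device_q))
            self.model_q = quat_multiply(delta, self.model_q)
            self.model_q /= np.linalg.norm(self.model_q)  # re-normalise
        self.last_device_q = device_q
```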