2021
DOI: 10.1145/3461732

M[eye]cro

Abstract: We present M[eye]cro, an interaction technique to select on-screen objects and navigate menus through the synergistic use of eye-gaze and thumb-to-finger microgestures. Thumb-to-finger microgestures are gestures performed with the thumb of a hand on the fingers of the same hand. The active body of research on microgestures highlights expected properties including speed, availability and eyes-free interaction. Such properties make microgestures a good candidate for multitasking. However, while praised, the stat…
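
The abstract pairs gaze pointing with thumb-to-finger microgestures for selection and menu navigation. As a rough illustration only, the sketch below shows one way such a loop could be wired up; the gaze_tracker.read() and microgesture_sensor.poll() calls and the gesture event fields are hypothetical placeholders, not the M[eye]cro implementation.

    # Hypothetical sketch of a gaze + thumb-to-finger selection loop.
    # gaze_tracker, microgesture_sensor and the gesture fields below are
    # illustrative assumptions, not APIs from the M[eye]cro paper.
    from dataclasses import dataclass

    @dataclass
    class Target:
        name: str
        x: float
        y: float
        radius: float

    def pick_gazed_target(gaze_xy, targets):
        """Return the target whose circular extent contains the gaze point, if any."""
        gx, gy = gaze_xy
        for t in targets:
            if (gx - t.x) ** 2 + (gy - t.y) ** 2 <= t.radius ** 2:
                return t
        return None

    def interaction_loop(gaze_tracker, microgesture_sensor, targets, menu):
        """Gaze pre-selects an object; a thumb tap confirms it, and taps on
        different finger segments then pick menu items for that object."""
        selected = None
        while True:
            gazed = pick_gazed_target(gaze_tracker.read(), targets)   # hypothetical read()
            gesture = microgesture_sensor.poll()                      # hypothetical poll()
            if gesture is None:
                continue
            if gesture.kind == "tap" and gazed is not None:
                selected = gazed                        # confirm the gazed object
            elif gesture.kind == "segment_tap" and selected is not None:
                command = menu.get(gesture.segment)     # map finger segment -> menu item
                if command is not None:
                    command(selected)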

Cited by 14 publications (13 citation statements)
References 58 publications

“…to improve gesture recognition accuracy [4,37,46,54]. However, these approaches primarily target input methods designed for able-bodied individuals, potentially leading to technological incompatibility.…”
Section: Design Considerations For Accessible Gesture Input In VR
confidence: 99%
“…HARDWARE - With a focus on interaction rather than recognition, we built a simple glove to recognize thumb-to-finger and stretch microgestures, inspired by Wambecke et al. [52]. The glove contains fourteen sensors: four flex sensors and ten pressure sensors (see Fig. 2 for the layout of the sensors).…”
Section: Exploratory Experiments
confidence: 99%
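
The quoted hardware description (four flex sensors plus ten pressure sensors, fourteen channels in total) suggests a simple polling-and-thresholding pipeline. The sketch below is a hypothetical illustration under that assumption; the channel ordering, the thresholds and the classify helper are not taken from the cited work.

    # Hypothetical sketch of turning 14 normalized glove readings
    # (4 flex + 10 pressure) into coarse microgesture events.
    NUM_FLEX = 4
    NUM_PRESSURE = 10
    PRESS_THRESHOLD = 0.6   # normalized pressure above which a pad counts as touched
    BEND_THRESHOLD = 0.5    # normalized flex above which a finger counts as stretched

    def classify(raw):
        """raw: list of 14 normalized readings, flex channels first, then pressure pads."""
        flex = raw[:NUM_FLEX]
        pressure = raw[NUM_FLEX:NUM_FLEX + NUM_PRESSURE]
        touched = [i for i, p in enumerate(pressure) if p > PRESS_THRESHOLD]
        bent = [i for i, f in enumerate(flex) if f > BEND_THRESHOLD]
        if touched:
            # Thumb pressing a pad on another finger -> thumb-to-finger microgesture
            return ("thumb_to_finger", touched[0])
        if bent:
            # Strong flex without touch -> stretch-style microgesture
            return ("stretch", bent[0])
        return ("idle", None)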
“…In this paper, we explore two modalities to reduce fatigue: eye-gaze and microgestures, i.e. fast and subtle finger movements [6,7]. The combination of gaze and microgestures has already been implemented for 2D selection of sparse targets displayed on a cockpit screen, and induced less fatigue than the use of cockpit physical controllers [52]. This paper examines whether this combination can be used for 3D selection in mixed reality.…”
Section: Introduction
confidence: 99%
“…The domain of microgesture interaction is very dynamic, with a focus on elaborating relevant microgesture sets through elicitation studies [13,40,45] and on evaluating end-users' performance [9,34,55,66]. Another area of research has been the design and implementation of systems and devices that sense and recognize such microgestures [30,51,68].…”
Section: Introduction
confidence: 99%