2021
DOI: 10.1145/3462546

An Exploration of Freehand Crossing Selection in Head-Mounted Augmented Reality

Abstract: Crossing, or goal crossing, has proven useful in various selection scenarios, including pen, mouse, touch, and virtual reality (VR). However, crossing has not been exploited for freehand selection using augmented reality head-mounted displays (AR HMDs). Using the HoloLens, we explore freehand crossing for selection and compare it to the state-of-the-art “gaze and commit” (head gaze) method. We report on three studies investigating freehand crossing in multiple use cases. The first study shows that crossing out…
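To illustrate the goal-crossing paradigm described in the abstract, the sketch below shows one common way such a selection can be detected: a target fires when the segment between two consecutive fingertip samples intersects the target's goal line in 2D. This is a minimal, hedged illustration under assumed names and coordinates, not the paper's implementation or any HoloLens API.

```python
# Minimal sketch of 2D goal-crossing detection (illustrative only; not the
# implementation from Uzor and Kristensson, 2021). A target counts as selected
# when the fingertip's motion segment between two frames crosses its goal line.

def _orient(a, b, c):
    """Signed area of triangle abc; the sign indicates which side of ab c lies on."""
    return (b[0] - a[0]) * (c[1] - a[1]) - (b[1] - a[1]) * (c[0] - a[0])

def segments_intersect(p1, p2, q1, q2):
    """True if segment p1-p2 properly crosses segment q1-q2 (ignores touching endpoints)."""
    d1 = _orient(q1, q2, p1)
    d2 = _orient(q1, q2, p2)
    d3 = _orient(p1, p2, q1)
    d4 = _orient(p1, p2, q2)
    return (d1 * d2 < 0) and (d3 * d4 < 0)

def crossed_goal(prev_pos, curr_pos, goal_start, goal_end):
    """Report a crossing selection for one frame of fingertip motion."""
    return segments_intersect(prev_pos, curr_pos, goal_start, goal_end)

# Example: the fingertip moves left to right across a vertical goal line at x = 1.
print(crossed_goal((0.0, 0.5), (2.0, 0.5), (1.0, 0.0), (1.0, 1.0)))  # True
```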

Cited by 10 publications (7 citation statements)
References: 66 publications
“…Tung and Hertzmann (2018) noted that users actually perform many 2D tasks in a 3D environment, such as operating buttons, sliders, and other widgets. Uzor and Kristensson (2021) found that 2D image planes can facilitate relatively simple 2D selections in a 3D environment. It is therefore feasible to use a 2D UI in a virtual environment.…”
Section: 2D UI in Virtual Environment (mentioning)
Confidence: 97%
“…Research on designing user interface (UI) interaction is a major focus in VR development. Many studies have compared different VR interaction modes (Argelaguet and Andujar, 2013; Bergström et al., 2021; Uzor and Kristensson, 2021); however, designing such modes specifically for the VR environment is time-consuming. Currently, many 3D scenes are available on the web.…”
Section: Introduction (mentioning)
Confidence: 99%
“…Other input modalities, such as head motion, hand gestures, and eye gaze, can also be used to point at and select small remote targets [10]. Besides virtual hands and pointing selection techniques, other selection methods, such as goal-crossing selection enabled by the freehand tracking of the Microsoft HoloLens, have also been evaluated in OST AR environments [9].…”
Section: Study 1: Freehand Gestural Selection in Wearable OST AR (mentioning)
Confidence: 99%
“…For example, selecting a target is one of the most frequent user interaction tasks. Target selection has been studied extensively in a variety of interaction contexts, including desktop computers [3,4], multi-touch mobile devices [5], 3D displays with hand-held devices [6], and freehand gestures [7-10]. In particular, freehand gestural interaction, which can provide a direct and natural user experience, is already expanding beyond lab settings and is commonly supported by commercial wearable OST AR products.…”
Section: Introduction (mentioning)
Confidence: 99%