2019 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct)
DOI: 10.1109/ismar-adjunct.2019.000-7
HIGS: Hand Interaction Guidance System

Cited by 8 publications (4 citation statements). References 14 publications.
“…Traditional task guidance systems (Pritchett, 1998, 2000) focused on providing the user with pre-loaded, task-specific information, without tracking the current state of the environment or being able to generalize to new tasks (Leelasawassuk et al., 2017; Reyes et al., 2020; Lu and Mayol-Cuevas, 2019; Wang et al., 2016). The complexity of the problem stems from several aspects, such as environment understanding, object and action recognition, detection of the user's preferences and mental state, real-time inference, etc. (Manuvinakurike et al., 2018; Kim et al., 2022).…”
Section: Task Guidance Systems (mentioning)
confidence: 99%
“…In addition, in the work of Chen et al. [7], a depth sensor is used to compare the depth within the palm area against the depth around the palm area to determine whether the hand is touching an object. Similarly, Lu et al. [8] use the same strategy and select the best threshold for their 'touch' criterion by maximizing the AUC (area under the curve). Likitlersuang et al. [9] focus on higher-level features, building a hand-object interaction classifier that takes optical flow and hand shape as input.…”
Section: Related Work, A. HOI Detection (mentioning)
confidence: 99%
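The depth-comparison strategy the citation above attributes to Chen et al. [7] and Lu et al. [8] can be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: the function name, the mask-based regions, and the millimetre-scale threshold are all assumptions, and the threshold would in practice be tuned (e.g. via the AUC analysis mentioned above).

```python
import numpy as np

def detect_touch(depth_map, palm_mask, ring_mask, threshold_mm=25.0):
    """Heuristic single-frame touch detection (hypothetical sketch).

    Compares the median depth inside the palm region with the median
    depth of a surrounding ring; when the two are close, the palm is
    assumed to rest on (touch) the underlying surface.

    depth_map   : 2D array of depth values in millimetres
    palm_mask   : boolean mask selecting the palm pixels
    ring_mask   : boolean mask selecting a ring of pixels around the palm
    threshold_mm: maximum palm-vs-surround depth gap counted as a touch
    """
    palm_depth = np.median(depth_map[palm_mask])
    ring_depth = np.median(depth_map[ring_mask])
    # A small gap means the palm lies at the surface's depth, i.e. touching;
    # a hovering hand sits closer to the camera than its surroundings.
    return abs(palm_depth - ring_depth) < threshold_mm
```

Sweeping `threshold_mm` over labelled frames and scoring each value with a ROC curve would reproduce the kind of AUC-based threshold selection described in the citation.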
“…Besides object-detector-based methods like [24, 8], the works of Chen et al. [3] and Lu et al. [15] use a depth sensor to measure the distance between the hand and the machine in order to detect a 'touch', which is effective but inflexible. Likitlersuang et al. [12] build a hand-object interaction classifier with optical flow and hand shape as input.…”
Section: HOI Detection (mentioning)
confidence: 99%