2002
DOI: 10.1006/ijhc.2002.1012

Vision-based user interfaces: methods and applications

Cited by 77 publications (29 citation statements)
References 118 publications
“…[31]. When interacting with such objects, user input is typically preceded by movement of gaze to areas of interest.…”
Section: Figure 3 Example Of Eye Tracking Data (mentioning)
confidence: 99%
“…Identifying is the action of interpreting the event or the phenomenon; for example, the user is executing a gesture with such (geometric) features. Tracking is the result of observing the evolution of the phenomenon, computing its actual state and relating it with the past states; for example, the user has drawn a shape recognized by the system [13].…”
Section: Gesture Vs Shape Recognition (mentioning)
confidence: 99%
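The distinction quoted above between identifying a gesture and tracking its evolution can be illustrated with a short sketch. This is a minimal, hypothetical Python example (the helper names, feature keys, and thresholds are invented for illustration and are not from the cited systems): identification interprets the features of a single observation, while tracking accumulates past states to decide what shape has been drawn so far.

from collections import deque

def identify(features):
    # "Identifying": interpret a single observation, e.g. label the
    # geometric features of the hand in the current frame.
    # Hypothetical rule: a nearly closed trajectory looks like a circle gesture.
    return "circle" if features.get("closure", 0.0) > 0.9 else "stroke"

class GestureTracker:
    # "Tracking": relate the current state to past states so the system
    # can recognize the shape the user has drawn over time.
    def __init__(self, history=30):
        self.states = deque(maxlen=history)  # recent per-frame labels and positions

    def update(self, features):
        label = identify(features)           # interpret the current frame
        self.states.append((label, features.get("position")))
        return self.recognize()

    def recognize(self):
        # Hypothetical criterion: enough accumulated "circle" evidence means
        # the drawn shape is accepted as a circle by the system.
        labels = [label for label, _ in self.states]
        return "circle" if labels.count("circle") > 10 else None

A per-frame caller would simply do tracker = GestureTracker() and then tracker.update({"closure": 0.95, "position": (120, 80)}) for each new observation; identification answers "what does this frame look like", while the tracker's history answers "what has the user drawn".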
“…Extensive surveys have been published in several HCI fields, such as face detection (Yang et al., 2002), face recognition (Zhao et al., 2003), facial expression analysis (Fasel and Luettin, 2003), gesture recognition (Pavlovic et al., 1997; Mitra and Acharya, 2007), and human motion analysis (Gavrila, 1999; Aggarwal and Cai, 1999; Wang et al., 2003). A survey presenting the use of vision in HCI, particularly in the area of head tracking, can be found in (Porta, 2002; Kisacanin, 2005). The authors of (Jaimes and Sebe, 2007) give an overview of multimodal human machine interfaces.…”
Section: Current Research (mentioning)
confidence: 99%