Proceedings 30th Applied Imagery Pattern Recognition Workshop (AIPR 2001). Analysis and Understanding of Time Varying Imagery
DOI: 10.1109/aipr.2001.991206

A basic hand gesture control system for PC applications

Abstract: We discuss the issues involved in controlling computer applications via gestures composed of both static symbols and dynamic motions. Each gesture is modeled from either static model information or a linear-in-parameters dynamic system. Recognition occurs in a real-time environment using a small amount of processing time and memory. We will examine which gestures are appropriate, how the gestures can be recognized, and which commands the gestures should control. The tracking method is detailed, along with its …
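The abstract mentions modeling dynamic gestures with a linear-in-parameters dynamic system. As a rough illustration of that general idea (not the authors' actual formulation), the sketch below fits a linear state-transition model to a 2-D hand trajectory by least squares and scores a new trajectory by its one-step prediction residual. The function names, the 2-D state, and the synthetic "circle" gesture are all assumptions made for the example.

```python
# Illustrative sketch only -- not the paper's implementation.
# Fit a linear-in-parameters dynamic model x[t+1] ~= A @ x[t] to a training
# trajectory of 2-D hand positions, then score a test trajectory by how well
# the fitted model predicts it (low residual => likely the same gesture).
import numpy as np

def fit_linear_dynamics(traj):
    """traj: (T, 2) array of hand positions; returns the 2x2 transition matrix A."""
    X, Y = traj[:-1], traj[1:]                  # pairs (x[t], x[t+1])
    C, *_ = np.linalg.lstsq(X, Y, rcond=None)   # least-squares fit of Y ~= X @ C
    return C.T                                  # so that x[t+1] ~= A @ x[t]

def residual(traj, A):
    """Mean one-step prediction error of model A on a trajectory."""
    pred = traj[:-1] @ A.T
    return float(np.mean(np.linalg.norm(traj[1:] - pred, axis=1)))

# Hypothetical usage: fit one model per dynamic gesture, classify a new
# trajectory by whichever model yields the smallest residual.
rng = np.random.default_rng(0)
theta = np.linspace(0, 2 * np.pi, 60)
circle = np.c_[np.cos(theta), np.sin(theta)]    # synthetic circular "gesture"
A_circle = fit_linear_dynamics(circle + 0.01 * rng.standard_normal(circle.shape))
print("residual on a noisy circle:", residual(circle, A_circle))
```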

Cited by 20 publications (5 citation statements) | References: 2 publications
“…These are typically detected using either RGB, infrared (IR), or depth cameras. There exists a large body of work focusing on detecting and tracking hands and fingers to enable multi-touch and mid-air gestures using RGB cameras [7,21,18,25]. Such systems typically use skin color detectors [18] or template matching [21] to segment the hand and then calculate contour and convexity defects [25] to identify fingers.…”
Section: Interaction With Projected Surfaces
confidence: 99%
“…To enable multi-touch interfaces using RGB cameras [4,12,37] there has been substantial work in image segmentation that tracks and identifies various body parts [20]. Typically these systems use skin color matching, edge or contour detection, and motion tracking to segment fingers and hands [12].…”
Section: Surface and Gesture Interaction
confidence: 99%
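The motion-tracking step mentioned in this statement can be approximated, in its simplest form, by background subtraction. The sketch below uses OpenCV's MOG2 background subtractor to isolate the largest moving region as a hand candidate; the subtractor parameters, the camera index, and the "largest mover is the hand" assumption are all illustrative choices rather than anything from the cited work.

```python
# Illustrative sketch of motion-based hand segmentation using background
# subtraction (one simple stand-in for the "motion tracking" step above).
import cv2

subtractor = cv2.createBackgroundSubtractorMOG2(history=200, detectShadows=False)

cap = cv2.VideoCapture(0)          # hypothetical webcam source
while True:
    ok, frame = cap.read()
    if not ok:
        break
    fg = subtractor.apply(frame)                       # foreground (moving) pixels
    fg = cv2.medianBlur(fg, 5)                         # suppress speckle noise
    contours, _ = cv2.findContours(fg, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if contours:
        hand = max(contours, key=cv2.contourArea)      # assume largest mover is the hand
        x, y, w, h = cv2.boundingRect(hand)
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow("motion", frame)
    if cv2.waitKey(1) & 0xFF == 27:                    # Esc to quit
        break
cap.release()
cv2.destroyAllWindows()
```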
“…Lenman et al. explored the use of pie- and marking menus in hand gesture-based interaction [16]. Cohen et al. studied the issues involved in controlling computer applications via hand gestures composed of both static and dynamic symbols [7]. Head pose and gesture offer several key conversational grounding cues and are used extensively in face-to-face interaction among people.…”
Section: Related Work
confidence: 99%