2008 19th International Conference on Database and Expert Systems Applications (DEXA 2008)
DOI: 10.1109/dexa.2008.103
Gestures, Shapes and Multitouch Interaction

Abstract: We discuss issues related to the design of a multitouch gesture sensing environment, allowing the user to execute both independent and coordinated gestures. We discuss different approaches, comparing frontal vs. back projection devices and gesture tracking vs. shape recognition. To compare the approaches we introduce a simple gesture language for drawing diagrams. A test multitouch device built around FTIR technology is illustrated; a vision system, driven by a visual dataflow programming environment,…

Cited by 2 publications (1 citation statement)
References 9 publications
“…Increasingly, however, robots are controlled through more intuitive methods, such as more natural motions of hand, gestures, or voice. [3,9,10,23,24] Vision-based hand gesture controls typically use a camera to recognize hand gestures, and interpret them as commands for the robot. [9,23] Users need only to learn to use the gestures but not special hardware.…”
Section: Robot Control By Gesture Command and Tangible Object
Confidence: 99%