Proceedings of the 21st ACM International Conference on Multimedia 2013
DOI: 10.1145/2502081.2502163

Hand and foot gesture interaction for handheld devices

Abstract: In this paper we present a hand- and foot-based immersive multimodal interaction approach for handheld devices. A smartphone-based immersive football game is designed as a proof of concept. Our proposed method combines the input modalities (i.e., hand and foot) and provides coordinated output to both modalities along with audio and video. In this work, the human foot gesture is detected and tracked using a template-matching method and the Tracking-Learning-Detection (TLD) framework. We evaluated our system's usability through …
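The abstract names template matching as one of the two detection components. As a rough illustration of that idea (not the paper's actual implementation, whose parameters and preprocessing are not given here), a minimal sum-of-squared-differences template matcher over a grayscale frame can be sketched as:

```python
import numpy as np

def match_template(frame, template):
    """Locate `template` in `frame` by minimizing the sum of squared
    differences (SSD) over all placements. A basic brute-force sketch
    of template matching; real systems typically use normalized
    cross-correlation and image pyramids for speed and robustness."""
    fh, fw = frame.shape
    th, tw = template.shape
    best_score, best_pos = float("inf"), (0, 0)
    for y in range(fh - th + 1):
        for x in range(fw - tw + 1):
            patch = frame[y:y + th, x:x + tw]
            score = np.sum((patch - template) ** 2)
            if score < best_score:
                best_score, best_pos = score, (x, y)
    return best_pos  # (x, y) of the best-matching top-left corner

# Toy example: an 8x8 frame with a bright 2x2 blob at (x=5, y=3).
frame = np.zeros((8, 8))
frame[3:5, 5:7] = 1.0
template = np.ones((2, 2))
print(match_template(frame, template))  # → (5, 3)
```

In a tracking pipeline such as TLD, a matcher like this would only serve as the per-frame detector; the learning component re-trains the appearance model online as the tracked object (here, the foot) changes pose.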


Cited by 22 publications
(8 citation statements)
References 10 publications
“…We have performed extensive tests to confirm the intuitiveness and robustness of the hand/foot detection and tracking algorithm [Lu et al. 2013], as well as to evaluate the usability of the designed gestures and the social acceptability of the demonstrations on a smartphone-based hybrid wearable framework [Lv 2013]; the designed touch-less gestures were perceived as 'applicable' and 'acceptable'. Future work will consist of tracking multiple body parts, for instance by applying multiple finger-tracking algorithms in parallel while keeping performance high.…”
Section: Results (mentioning)
Confidence: 99%
“…The presented application allows the user to perform touch-less interaction by hand or foot gesture in front of the camera of Google Glass. Based on the same core ubiquitous gesture-recognition algorithm as used in this demonstration, a hybrid wearable smartphone system based on mixed hardware and software was presented in our previous work [Lv 2013; Lu et al. 2013], which can support either hand or foot interaction with today's smartphones.…”
Section: Introduction (mentioning)
Confidence: 99%
“…The football application was presented by Lv et al. [2014]. We also made a preliminary extension of this method to other parts of the human body, such as the hand [Lu and ur Réhman 2013]. …”
Section: Preliminary User Study (mentioning)
Confidence: 99%
“…In the method proposed in [15], the main problem was the Kinect sensor, which is by definition a non-mobile sensor in terms of computational weight and energy consumption. Paelke et al. [18] and later Lu et al. [19] used computer vision as in [15] but changed the sensor to the mobile device's back camera in order to recognize the foot and track its movements. There is still a series of problems with this method directly related to the use of vision.…”
Section: Techniques Used for Gesture Recognition (mentioning)
Confidence: 99%