2013
DOI: 10.1016/j.patrec.2013.02.004
Experiencing real 3D gestural interaction with mobile devices

Cited by 11 publications (6 citation statements); references 24 publications.
“…The problems with direct-touch finger input are 'the occlusion problem' and 'the fat finger problem' [29]. Touch-less interaction extends the operation space, offering a resolution at least 5-10 times higher than 2D touch for interacting with mobiles [30]. It also mitigates the problems of using a mobile phone when the user's hands are unavailable, for instance because they are dirty, gloved in winter, or holding a steering wheel while driving [3]. This improves safety in contexts where switching visual attention between the device and the physical environment poses a risk, because touch-less interaction does not require the user to physically manipulate the device. In addition, another frequently mentioned motivation is that touch-less interaction supports more gestures in more natural patterns, providing less device-centric interaction techniques that allow a mobile user to focus attention on the task and its content rather than on the device while on the move. In this paper we explore and prototype a novel alternative touch-less approach to mobile interaction that uses a body-worn (i.e.…”
Section: Introduction (mentioning, confidence: 99%)
“…In order to compare the effects of the augmented-reality rendering interaction UI and the pure graphic UI, a 'smile face' button at the top right of the screen was developed to toggle the hide/show status of the camera view, as in Figure 11. The stereoscopic 3D part is time-consuming to run on today's ordinary smartphone hardware, and the low performance would affect user experience [Yousefi et al 2013]. Therefore we did not combine it into the proof-of-concept football game, although we implemented it on a smartphone as a standalone application.…”
Section: Multimodal Football Game - A Proof of Concept (mentioning, confidence: 99%)
“…However, to enable gesture-based 3D object manipulation, 27-DOF hand pose recovery is not necessarily needed. In fact, six motion parameters are sufficient to move and manipulate a graphical model in virtual/augmented reality applications (Yousefi et al., 2013). These six parameters, i.e.…”
Section: Novel Approach (mentioning, confidence: 99%)
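The six motion parameters mentioned above correspond to a standard rigid-body pose: three translations and three rotations. As a minimal sketch (not taken from the cited paper; the function name, parameter order, and Euler-angle convention are assumptions for illustration), the following shows how such parameters could be assembled into a 4x4 model matrix for manipulating a graphical object:

```python
import numpy as np

def pose_to_matrix(tx, ty, tz, roll, pitch, yaw):
    """Build a 4x4 homogeneous model matrix from six motion
    parameters: translations (tx, ty, tz) and rotations
    (roll, pitch, yaw) in radians. The rotation order Rz @ Ry @ Rx
    is one common convention, chosen here as an assumption."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    # Elementary rotations about the x, y, and z axes
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    M = np.eye(4)
    M[:3, :3] = Rz @ Ry @ Rx   # combined rotation
    M[:3, 3] = [tx, ty, tz]    # translation column
    return M
```

Once estimated from the hand gesture, such a matrix can be applied directly to a model's vertices (in homogeneous coordinates) to move and orient it in the virtual scene.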
“…In this experiment, the grab gesture is used for object manipulation. The grab gesture is one of the most natural and frequently used gestures for manipulating objects in 3D space, and it has been used to perform gesture-based interaction in mobile applications (Kondori et al., 2011a; Yousefi et al., 2013). As shown in Fig.…”
Section: Gestural Interaction for Mobile Applications (mentioning, confidence: 99%)