Proceedings of the 23rd Conference on l'Interaction Homme-Machine 2011
DOI: 10.1145/2044354.2044360

Using the user's point of view for interaction on mobile devices

Abstract: We present an interaction technique for mobile devices (smartphones and tablets) based on tracking the user's face. This technique opens up new possibilities for both input and output interaction on mobile devices. For output, head tracking can be used to control the viewpoint on a 3D scene displayed on the screen (Head-Coupled Perspective, HCP). This technique improves output interaction by providing depth perception and by allowing…
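The abstract describes Head-Coupled Perspective (HCP): the viewpoint on the rendered 3D scene follows the position of the user's head, estimated from the front camera, so the screen behaves like a window onto the scene. As a rough illustration only (not the authors' implementation), the sketch below computes the standard asymmetric, off-axis projection from a head position supplied by a hypothetical face tracker; the head position values, screen dimensions, and the off_axis_projection helper are all assumptions.

    # Minimal HCP sketch (illustrative assumptions): head position in metres,
    # relative to the screen centre, x right, y up, z out towards the viewer,
    # as a hypothetical face tracker might report it.
    import numpy as np

    def off_axis_projection(head, screen_w, screen_h, near=0.1, far=100.0):
        """Asymmetric frustum so the screen acts as a window onto the 3D scene."""
        x, y, z = head
        # Project the physical screen edges onto the near plane as seen from the head.
        left   = (-screen_w / 2 - x) * near / z
        right  = ( screen_w / 2 - x) * near / z
        bottom = (-screen_h / 2 - y) * near / z
        top    = ( screen_h / 2 - y) * near / z
        # Standard OpenGL-style off-axis perspective matrix.
        return np.array([
            [2 * near / (right - left), 0.0, (right + left) / (right - left), 0.0],
            [0.0, 2 * near / (top - bottom), (top + bottom) / (top - bottom), 0.0],
            [0.0, 0.0, -(far + near) / (far - near), -2 * far * near / (far - near)],
            [0.0, 0.0, -1.0, 0.0],
        ])

    # Example: head 10 cm to the right of a 15 cm x 9 cm tablet screen, 40 cm away.
    proj = off_axis_projection(head=(0.10, 0.0, 0.40), screen_w=0.15, screen_h=0.09)

In such a setup the view matrix would also be translated by the negative head position, so the scene stays anchored behind the screen plane rather than moving with the viewer.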

Cited by 43 publications (32 citation statements)
References 29 publications
“…However, the user's point of view can be estimated not only from sensor information, but also by face tracking. Thus, Francone and Nigay [24] computed the position of the device relative to the user's head and used it to control the viewpoint on a 3D scene. However, the 360 degrees of the scene can be seen only when using sensors.…”
Section: D Mmmis-explicit Related Work
mentioning, confidence: 99%
“…Francone and Nigay [24] allow users to interchange digital information among their portable computers, table and wall displays, as well as other physical objects through hyper-dragging. A proposed interaction technique of hyper-dragging is that users can easily share information such as a picture or video by using a cursor to drag it to the physical place where they want to upload the information [25].
Section: D Mmmis-explicit Related Work
mentioning, confidence: 99%
“…Similarly, Francone and Nigay [4] performed a study on mobile devices using facial tracking to mimic user perception.…”
Section: A Related Work
mentioning, confidence: 99%
“…There were two independent variables: input method (tilt-input, facial tracking) and life (1, 2, 3, 4, 5). The dependent variables were survival time, stars collected, and score.…”
Section: Design
mentioning, confidence: 99%
“…Head tracking, on the other hand, is readily available in most CAVE-like systems and has been successfully integrated in large-display [24], video-conference [38], mobile [37], surface [9] and floor-projected [32] interactive systems. In a real-life scenario, Stiefelhagen and Zhu [38] showed that head orientation contributed 68.9% to the overall gaze direction and could estimate attention focus with 88.7% accuracy.…”
Section: Gaze
mentioning, confidence: 99%