Proceedings of the 4th International Conference on Mobile Technology, Applications, and Systems and the 1st International Symposium, 2007
DOI: 10.1145/1378063.1378122
Eye-gaze interaction for mobile phones

Abstract: In this paper, we discuss the use of eye-gaze tracking technology for mobile phones. In particular, we investigate how gaze interaction can be used to control applications on handheld devices. In contrast to eye-tracking systems for desktop computers, mobile devices pose several problems, such as varying light intensity for outdoor use and calibration issues. Therefore, we compared two different approaches for controlling mobile phones with the eyes: standard eye-gaze interaction based on the dwell-time method and …

Cited by 124 publications (71 citation statements)
References 8 publications
“…A study was carried out by [20] on eye-gaze interaction for mobile phone use following two methods, the standard dwell-time based method and the gaze gesture method. Proposing to implement an eye tracker on a mobile phone platform, they further designed a number of gaze gestures which, upon recognition, can trigger certain actions such as scrolling up and down a phone book, opening or closing an internet browser.…”
Section: Related Work
confidence: 99%
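The dwell-time method contrasted above can be sketched in a few lines: a target counts as selected once the gaze rests on it for a minimum duration. This is a minimal illustration, not the authors' implementation; the sample format and the `dwell_ms` threshold are assumptions.

```python
# Hypothetical dwell-time selection sketch. A target is "selected" once the
# gaze stays on it for at least `dwell_ms` milliseconds without moving away.

def dwell_select(samples, dwell_ms=1000):
    """samples: iterable of (timestamp_ms, target_id or None) gaze samples."""
    current, since = None, None
    for t, target in samples:
        if target != current:
            # Gaze moved to a new target (or off all targets): restart the timer.
            current, since = target, t
        elif target is not None and t - since >= dwell_ms:
            return target, t  # dwell threshold reached: selection fires
    return None, None

# Example: gaze rests on a phone-book entry long enough to trigger selection.
samples = [(0, None), (100, "contact_3"), (600, "contact_3"), (1200, "contact_3")]
print(dwell_select(samples))  # ('contact_3', 1200)
```

A longer `dwell_ms` reduces accidental selections (the "Midas touch" problem) at the cost of slower interaction, which is one motivation for the gaze-gesture alternative.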
“…It has been proposed in the literature for disability assistance and other HCI purposes [37,38]. The realisation of remote control of an HCI system via gaze gestures is non-invasive, low-cost and efficient, involving 4 main stages: accurate eye centre localisation in the spatial-temporal domain, eye movement encoding [20], gaze gesture recognition [39] and HCI event activation.…”
Section: A. Gaze Gesture Recognition
confidence: 99%
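The encoding and recognition stages of the pipeline described above can be sketched as follows: relative eye movements are quantised into direction tokens and matched against predefined gesture strings. This is a hedged illustration under assumed names and thresholds (`GESTURES`, `min_dist`), not the cited authors' code.

```python
# Sketch of gaze-gesture recognition: encode relative gaze movement as
# U/D/L/R tokens, then look the token string up in a gesture table.
# The gesture-to-action mapping below is hypothetical.

GESTURES = {"RDLU": "open_browser", "DDU": "scroll_phonebook"}

def encode(points, min_dist=30):
    """Encode a sequence of (x, y) gaze points as a string of U/D/L/R moves."""
    tokens = []
    px, py = points[0]
    for x, y in points[1:]:
        dx, dy = x - px, y - py
        if abs(dx) < min_dist and abs(dy) < min_dist:
            continue  # ignore small jitter (fixational eye movements)
        if abs(dx) >= abs(dy):
            tokens.append("R" if dx > 0 else "L")
        else:
            tokens.append("D" if dy > 0 else "U")  # screen y grows downward
        px, py = x, y
    return "".join(tokens)

def recognize(points):
    return GESTURES.get(encode(points))

# A right-down-left-up square of gaze points matches the "RDLU" gesture.
square = [(0, 0), (100, 0), (100, 100), (0, 100), (0, 0)]
print(recognize(square))  # open_browser
```

Because only relative movement is encoded, the scheme is insensitive to absolute fixation position, which is the robustness-to-head-movement property noted in the citation below.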
“…[34] carried out a study on eye-gaze interaction for mobile phone use following two methods, the standard dwell-time based method and the gaze gesture method. This study concludes that gaze gesture is robust to head movement since it only captures relative eye movement rather than absolute eye fixation points.…”
Section: Eye/Gaze Analysis
confidence: 99%
“…However, the system is very sensitive to other movements of the face and thus of limited use in everyday activities, especially if the person being tracked has only limited control of his facial muscles. Electro-oculography, however, could be used to distinguish simple eye-based gestures (Drewes, De Luca, & Schmidt, 2007) where a high spatial precision is not required.…”
Section: Devices for Gaze Tracking
confidence: 99%
“…The logical next step has been taken by Vaitukaitis and Bulling (2012), who adopted the work of Drewes, De Luca & Schmidt (2007) and presented a first prototype on a portable device, an Android smartphone. In the near future, I envision handicapped users to have a personal mobile gaze-based interface, e.g.…”
Section: Beyond Verbal Communication
confidence: 99%