2006 International Conference on Intelligent Engineering Systems
DOI: 10.1109/ines.2006.1689387
System I4Control®: Contactless control PC

Cited by 4 publications (5 citation statements)
References 1 publication
“…From the existing works, gaze gestures (GG) are widely captured with one of three groups of input sensors, as depicted in Table 1, depending on the application, availability, and cost. The first group addresses motion-based sensors for GG, including EOG, which works on the electric potential difference between the retina and cornea that causes changes in the electrostatic field [8]. These position changes are sensed by electrodes attached to the user's skin close to the eyes, as in [9], [10], [11], and [12].…”
Section: Related Work
confidence: 99%
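To make the EOG idea above concrete, here is a minimal sketch of how a sustained shift in the corneo-retinal potential picked up by skin electrodes could be turned into a coarse horizontal gaze label. The baseline, threshold, and sample values are illustrative assumptions, not parameters from the cited works.

import numpy as np

# Hypothetical EOG samples (microvolts) from electrodes placed near the outer
# canthi; the corneo-retinal potential shifts as the eyes rotate, so a sustained
# deviation from the resting baseline indicates a left or right gaze movement.
def classify_horizontal_gaze(samples_uv, baseline_uv=0.0, threshold_uv=50.0):
    """Label each sample as 'left', 'right', or 'center' by simple thresholding."""
    deviation = np.asarray(samples_uv, dtype=float) - baseline_uv
    labels = np.full(deviation.shape, "center", dtype=object)
    labels[deviation > threshold_uv] = "right"   # positive shift: gaze to the right
    labels[deviation < -threshold_uv] = "left"   # negative shift: gaze to the left
    return labels

# Example: a synthetic trace with a rightward saccade followed by a return to center.
trace = [2, 5, 80, 85, 78, 10, -3]
print(classify_horizontal_gaze(trace))

In practice an EOG pipeline would also filter drift and detect saccade onsets, but simple thresholding is enough to show how electrode-level signals map to gaze gestures.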
“…The third group is video-oculography (VOG), which is the most widely adopted today because it can capture images of the subject's eyes and estimate eye positions and the point of gaze (POG), i.e., where the user is looking [17], [18]. The first sub-group of VOG is camera-based, which detects the pupil pose and converts it into coordinates [8], [2], [19], [20]. However, camera-based approaches suffer from problems with interpolation, head movement, segmentation, and identification, and are sometimes cumbersome.…”
Section: Related Work
confidence: 99%
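The camera-based VOG step described above, converting a detected pupil position into screen coordinates, can be sketched as a simple calibration fit. The function names, the affine model, and the four calibration pairs below are assumptions for illustration; real systems typically use richer polynomial models and head-movement compensation.

import numpy as np

# Minimal sketch of a camera-based VOG mapping: given pupil-centre coordinates in
# the eye image and a few calibration pairs (pupil position, known screen target),
# fit an affine transform by least squares and use it to estimate the point of gaze.
def fit_affine(pupil_xy, screen_xy):
    """Solve screen = [px, py, 1] @ A for a 3x2 affine matrix A."""
    P = np.hstack([pupil_xy, np.ones((len(pupil_xy), 1))])   # (N, 3)
    A, *_ = np.linalg.lstsq(P, screen_xy, rcond=None)        # (3, 2)
    return A

def estimate_gaze(A, pupil_point):
    px, py = pupil_point
    return np.array([px, py, 1.0]) @ A                        # screen (x, y)

# Hypothetical 4-point calibration: pupil centres (pixels) vs. screen targets (pixels).
pupil = np.array([[310, 240], [350, 242], [312, 270], [352, 272]], dtype=float)
screen = np.array([[200, 150], [1720, 150], [200, 930], [1720, 930]], dtype=float)
A = fit_affine(pupil, screen)
print(estimate_gaze(A, (330, 255)))   # roughly mid-screen for this synthetic setup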
“…People with physical impairments and disabilities rely on gaze-assisted interaction to communicate and perform basic operations on a computer [3]. Gaze-assisted interaction is also crucial in scenarios of situationally induced impairments and disabilities [27], [28]. While dwell-based selection has been the predominant technique in gaze-assisted interactions, dwell has various limitations related to accuracy, performance, and usability [7], [29], [30].…”
Section: Related Work
confidence: 99%
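For readers unfamiliar with the dwell-based selection mentioned above, here is a minimal sketch of the idea: a target is selected once the estimated gaze point stays within a tolerance radius of it for a continuous dwell interval. The dwell time, tolerance radius, and sample timestamps are illustrative assumptions, not values from the cited papers.

import math

DWELL_TIME_S = 0.8      # assumed dwell threshold
TOLERANCE_PX = 40.0     # assumed on-target radius

def dwell_select(gaze_samples, target_xy):
    """gaze_samples: iterable of (timestamp_s, x, y). Returns the selection time or None."""
    dwell_start = None
    for t, x, y in gaze_samples:
        on_target = math.hypot(x - target_xy[0], y - target_xy[1]) <= TOLERANCE_PX
        if on_target:
            dwell_start = t if dwell_start is None else dwell_start
            if t - dwell_start >= DWELL_TIME_S:
                return t          # selection triggered
        else:
            dwell_start = None    # gaze left the target: reset the dwell timer
    return None

samples = [(0.0, 400, 300), (0.3, 405, 298), (0.6, 398, 303), (0.9, 402, 301)]
print(dwell_select(samples, (400, 300)))   # -> 0.9

The accuracy and usability limitations the passage cites follow directly from this logic: too short a dwell time causes unintended selections (the "Midas touch" problem), while too long a dwell time slows interaction.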
“…The authors found that visual feedback significantly reduced the number of selection errors. Fejtová et al. [28] created a system called ''I4Control'', which allows individuals with special needs contactless control of a personal computer through eye or head movement.…”
Section: B. Accessible Interactions
confidence: 99%
“…Various techniques have been developed that enable persons with motor impairments to control the computer via emulation of the keyboard and the mouse, such as one-switch input [2], voice input [3], or eye tracking [1]. These methods are, however, tailored to the most common applications and interactions.…”
Section: Introduction
confidence: 99%
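As a rough illustration of the one-switch keyboard emulation mentioned in this last passage, a common pattern is "scanning": the interface highlights candidate keys one at a time, and pressing the single switch while a key is highlighted types that key. The key set, scan interval, and timing model below are illustrative assumptions, not details from the cited works.

# Hedged sketch of single-switch "scanning" keyboard emulation.
KEYS = ["A", "B", "C", "D", "E", "SPACE", "DELETE"]

def scan_and_select(switch_press_times, scan_interval_s=0.6):
    """switch_press_times: sorted timestamps (s) of switch presses. Returns typed keys."""
    typed = []
    for press_time in switch_press_times:
        # The highlight cycles through KEYS every scan_interval_s seconds.
        index = int(press_time // scan_interval_s) % len(KEYS)
        typed.append(KEYS[index])
    return typed

# Example: one press while "B" is highlighted and one while "E" is highlighted.
print(scan_and_select([0.7, 2.5]))   # -> ['B', 'E']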