2021
DOI: 10.1016/j.ijhcs.2021.102676
EyeTAP: Introducing a multimodal gaze-based technique using voice inputs with a comparative analysis of selection techniques

Cited by 13 publications (3 citation statements)
References 12 publications
“…This makes it challenging to locate the gaze position accurately and thus can trigger the Midas touch problem. The key to tackling the Midas touch problem is to analyse the eye movement metrics through the cognitive process and separate users' true intention from the unintentional activities [51,61,183,199]. The other applications that combine fixation and eye-opening/closing actions to select and manipulate objects can be further explored.…”
Section: Discussion
Confidence: 99%
“…Miniotas et al (2006) combined voice control with the eye-control system to improve the accuracy of pointing at small target controls. The EyeTAP system proposed by Parisay et al (2020) replaced voice control with sound pulse recognition to render the system more robust under environmental noise disturbances. In addition, some researchers introduced special facial interactions, such as breathing and lip-speaking, into the eye-control system (Su et al, 2021;Onishi et al, 2022).…”
Section: Eye Interaction Movement Research
Confidence: 99%
“…The purpose of the first evaluation was to study the ESPiM model. In this study, we used the unpublished datasets of our previous paper EyeTAP [35] in which we collected large amounts of infrared eye-tracking data.…”
Section: User Study 1: Fitts' Study (In-person)
Confidence: 99%