2021
DOI: 10.1109/access.2021.3072195

Developing a Novel Hands-Free Interaction Technique Based on Nose and Teeth Movements for Using Mobile Devices

Abstract: Human-mobile interaction aims to facilitate interaction with smartphone devices. The conventional way to interact with mobile devices is through manual input, and most applications are built on the assumption that the end user has full control over their hand movements. However, this assumption excludes people who are unable to use their hands or have suffered limb damage. In this paper, we propose a nose- and teeth-based interaction system, which allows users to control their mobile devices comple…

Cited by 2 publications (3 citation statements) · References 31 publications
“…After that, the user can switch to the joystick mode to more precisely control where the pointer is located. If a non-rigid alteration is detected in the contour of the face, then any of the two modes can be activated or deactivated [19]. It uses standard infrared (IR) sensors to track head movements and a standard microcontroller (MCU) to map the sensor data to either relative movement (joystick mode) or an exact location on the screen (direct mapping mode).…”
Section: Literature Review
confidence: 99%
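The two pointer-control modes quoted above can be sketched in code. This is a minimal illustration, assuming a head-tracking sensor that reports a normalized offset (dx, dy) in [-1, 1]; the function names, screen size, and gain are hypothetical, not taken from [19].

```python
# Two pointer-control modes for a head-tracking sensor:
# joystick mode nudges the cursor relative to its current position,
# direct-mapping mode maps the sensor offset straight to a screen point.

SCREEN_W, SCREEN_H = 1080, 1920  # assumed portrait phone resolution


def joystick_step(cursor, offset, gain=25):
    """Joystick mode: move the cursor proportionally to the offset,
    clamped to the screen bounds."""
    x = min(max(cursor[0] + offset[0] * gain, 0), SCREEN_W - 1)
    y = min(max(cursor[1] + offset[1] * gain, 0), SCREEN_H - 1)
    return (x, y)


def direct_map(offset):
    """Direct-mapping mode: translate the normalized offset in [-1, 1]
    into an absolute screen coordinate."""
    x = (offset[0] + 1) / 2 * (SCREEN_W - 1)
    y = (offset[1] + 1) / 2 * (SCREEN_H - 1)
    return (x, y)
```

Joystick mode suits fine positioning (small offsets produce small steps), while direct mapping lets the user jump anywhere on screen in one motion, which matches the mode-switching workflow the excerpt describes.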
“…It uses standard infrared (IR) sensors to track head movements and a standard microcontroller (MCU) to map the sensor data to either relative movement (joystick mode) or an exact location on the screen (direct mapping mode). Even though the way it works is pretty simple in theory, the system has to deal with several problems, such as optical noise from different sources, how people move their heads, accuracy, and power consumption [19].…”
Section: Literature Review
confidence: 99%
“…This system not only showed how nose and teeth movement can be interpreted for alternate means of interaction but also demonstrated the importance of system evaluation for the justification of such systems. Islam et al [34] developed and evaluated another nose tracking cursor control system but it was developed for using Android smartphones. A disabled user unable to use his/her hands can use the system to perform all the basic touch operations and button operations.…”
Section: Introduction
confidence: 99%