2018
DOI: 10.1016/j.cmpb.2018.08.013

A facial expression controlled wheelchair for people with disabilities

Cited by 62 publications (29 citation statements)
References 35 publications
“…Fernandes et al [16] present an interesting review of recent technologies for the special assistance and orientation of people who are visually impaired. Rabhi et al [17] propose recognizing facial expressions and using them as command modes for wheelchair control; their system uses neural networks running on a Raspberry Pi and a smartphone application for image capture and human-machine interaction, with reported effectiveness ranging from 95% to 97.1%. Nandini and Seeja [18] present a new algorithm for easy supermarket navigation by visually impaired people; their results show that the algorithm finds the optimal path, avoiding obstacles while minimizing distance and turns along the route.…”
Section: Introduction
mentioning
confidence: 99%
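The excerpt above summarizes the approach of Rabhi et al [17], in which recognized facial expressions are translated into wheelchair commands. As a minimal sketch of what such a mapping could look like, the Python snippet below pairs classifier output labels with motion commands; the expression labels, command set, and confidence threshold are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch (not the authors' implementation): mapping facial-expression
# labels, as produced by a classifier such as the one described in [17], to
# wheelchair motion commands. Labels, commands, and threshold are assumptions.
from enum import Enum


class Command(Enum):
    FORWARD = "forward"
    BACKWARD = "backward"
    LEFT = "left"
    RIGHT = "right"
    STOP = "stop"


# Hypothetical expression-to-command table; a real system would define its
# own expression vocabulary and safety logic.
EXPRESSION_TO_COMMAND = {
    "smile": Command.FORWARD,
    "frown": Command.BACKWARD,
    "left_wink": Command.LEFT,
    "right_wink": Command.RIGHT,
    "neutral": Command.STOP,
}


def expression_to_command(label: str, confidence: float,
                          threshold: float = 0.95) -> Command:
    """Return a motion command only when the classifier is confident enough;
    otherwise stop, which is the safe default for a wheelchair."""
    if confidence < threshold:
        return Command.STOP
    return EXPRESSION_TO_COMMAND.get(label, Command.STOP)


if __name__ == "__main__":
    print(expression_to_command("smile", 0.97))  # Command.FORWARD
    print(expression_to_command("smile", 0.80))  # Command.STOP (low confidence)
```

The stop-by-default behavior on low confidence or unknown labels reflects a common safety consideration in such interfaces rather than a detail taken from the cited work.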
“…Some other recent approaches incorporated voice commands and vocal feedback to share control decisions with the user, such as obstacle avoidance, safe approach to objects, and navigation along a specific path, and to learn from these decisions [30]. For patients who cannot manipulate a standard joystick, many systems have been developed to translate users' face and body gestures and eye movements into control commands for the wheelchair through visual feedback [31,32]. The same approach of gesture classification and recognition is also carried out by gathering surface electromyography (EMG) [33,34] and electroencephalography (EEG) [35•] bio-signals.…”
Section: Assistive Mobile Robots
mentioning
confidence: 99%
“…Voice [30]- and facial expression [31]-based wheelchair control systems have been explored by different groups. However, voice control is laborious for the user, and interference from sound waves or distractions in a noisy environment can introduce undesired commands into the system.…”
Section: Background and Related Work
mentioning
confidence: 99%