2007 IEEE International Symposium on Industrial Electronics
DOI: 10.1109/isie.2007.4374877

Interface Framework to Drive an Intelligent Wheelchair Using Facial Expressions

Abstract: Many of the physically injured use electric wheelchairs as an aid to locomotion. Usually, commanding this type of wheelchair requires the use of one's hands, which poses a problem to those who, besides being unable to use their legs, are also unable to properly use their hands. The aim of the work described here is to create a prototype of a wheelchair command interface that does not require hand usage. Facial expressions were chosen instead, to provide the necessary visual information for the inte…

Cited by 24 publications (14 citation statements) · References 9 publications
“…The facial expression input makes use of image-processing algorithms to detect features, such as color segmentation and edge detection, followed by the application of a neural network to detect the user's desire. The results shown in Faria et al [39] provide evidence that comfortably driving an IW with the use of facial expressions is possible. However, such input still demonstrates some limitations regarding color segmentation (high sensitivity to large light variations and slight color shifts) and shape extraction (precision needs to be improved without increasing the processing time).…”
Section: Discussion (citation type: mentioning)
confidence: 96%
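The pipeline described in that statement, color segmentation and edge detection followed by a neural network that maps the extracted features to a driving intention, could look roughly like the sketch below. This is an illustrative assumption of such a pipeline in Python with OpenCV; the skin-color bounds, the grid-pooled edge features, the command set, and the untrained network are placeholders, not values from the cited paper.

```python
# Illustrative sketch of a facial-expression command pipeline:
# colour segmentation -> edge detection -> small neural network.
# All thresholds, feature sizes and weights are assumptions.
import cv2
import numpy as np

SKIN_LOWER = np.array([0, 133, 77], dtype=np.uint8)    # assumed YCrCb skin bounds
SKIN_UPPER = np.array([255, 173, 127], dtype=np.uint8)

def extract_features(frame_bgr: np.ndarray, grid: int = 8) -> np.ndarray:
    """Segment skin-coloured pixels, detect edges, and pool edge density
    over a grid to obtain a fixed-size feature vector."""
    ycrcb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2YCrCb)
    skin_mask = cv2.inRange(ycrcb, SKIN_LOWER, SKIN_UPPER)
    masked = cv2.bitwise_and(frame_bgr, frame_bgr, mask=skin_mask)
    edges = cv2.Canny(cv2.cvtColor(masked, cv2.COLOR_BGR2GRAY), 50, 150)
    h, w = edges.shape
    cells = edges[: h // grid * grid, : w // grid * grid]
    cells = cells.reshape(grid, h // grid, grid, w // grid).mean(axis=(1, 3))
    return cells.flatten() / 255.0

class TinyMLP:
    """One-hidden-layer network; in practice the weights would be trained
    on labelled facial-expression images, which is omitted here."""
    def __init__(self, n_in: int, n_hidden: int, n_out: int, seed: int = 0):
        rng = np.random.default_rng(seed)
        self.w1 = rng.normal(0.0, 0.1, (n_in, n_hidden))
        self.w2 = rng.normal(0.0, 0.1, (n_hidden, n_out))

    def predict(self, x: np.ndarray) -> int:
        hidden = np.tanh(x @ self.w1)
        return int(np.argmax(hidden @ self.w2))

COMMANDS = ["forward", "left", "right", "stop"]  # assumed command set

if __name__ == "__main__":
    frame = cv2.imread("face.jpg")               # hypothetical input image
    if frame is not None:
        features = extract_features(frame)
        model = TinyMLP(n_in=features.size, n_hidden=16, n_out=len(COMMANDS))
        print("Predicted command:", COMMANDS[model.predict(features)])
```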
“…Typically, an IW is controlled by a computer, has a set of sensors, and applies techniques derived from mobile robotics research to process the sensor information and generate motor commands. The interface may consist of a conventional wheelchair joystick, voice-based control, facial expressions (Faria et al, 2007), or even vision control, among others. The concept of an IW differs from that of a conventional electric wheelchair, since in the latter case the user takes manual control over motor speed and direction via a joystick or other switch, without intervention by the wheelchair's control system.…”
Section: Definition and Characteristics of an Intelligent Wheelchair (citation type: mentioning)
confidence: 99%
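As a rough illustration of that distinction, interchangeable input interfaces feeding a control layer that mediates between user intent and the motors rather than a joystick wired straight to the motors, a minimal Python sketch follows. The class names, command set, speeds, and obstacle threshold are hypothetical and not taken from the cited works.

```python
# Sketch of the intelligent-wheelchair architecture described above:
# any front-end (joystick, voice, facial expression) yields a high-level
# command, and the control layer combines it with sensor data before
# issuing motor commands. All values are illustrative assumptions.
from abc import ABC, abstractmethod
from dataclasses import dataclass

@dataclass
class MotorCommand:
    linear: float   # m/s
    angular: float  # rad/s

class CommandInterface(ABC):
    """Joystick, voice or facial-expression front-ends all reduce to this."""
    @abstractmethod
    def read_command(self) -> str: ...

class FacialExpressionInterface(CommandInterface):
    def read_command(self) -> str:
        # Would run the vision pipeline; a fixed value is returned here.
        return "forward"

class IntelligentWheelchair:
    SPEEDS = {"forward": (0.5, 0.0), "left": (0.2, 0.6),
              "right": (0.2, -0.6), "stop": (0.0, 0.0)}

    def __init__(self, interface: CommandInterface):
        self.interface = interface

    def step(self, obstacle_distance_m: float) -> MotorCommand:
        linear, angular = self.SPEEDS[self.interface.read_command()]
        if obstacle_distance_m < 0.5:   # sensor-based intervention by the control system
            linear = 0.0
        return MotorCommand(linear, angular)

if __name__ == "__main__":
    chair = IntelligentWheelchair(FacialExpressionInterface())
    print(chair.step(obstacle_distance_m=0.3))
```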
“…2(b). If the number of pixels falling into a particular region with a certain (Cb, Cr) value exceeds a threshold value, that pixel is considered skin-colored and the image is transformed into a binary image F using (2).…”
Section: A. Skin Color Segmentation (citation type: mentioning)
confidence: 99%
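A minimal sketch of that thresholding step, assuming common illustrative (Cb, Cr) skin bounds in place of the actual region and equation (2) from the cited work:

```python
# Pixels whose (Cb, Cr) values fall inside an assumed skin region are set
# to 1 in a binary image F, everything else to 0. The bounds below are
# illustrative, not the ones used in the cited paper.
import cv2
import numpy as np

def skin_binary_image(frame_bgr: np.ndarray,
                      cb_range=(77, 127), cr_range=(133, 173)) -> np.ndarray:
    ycrcb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2YCrCb)
    _, cr, cb = cv2.split(ycrcb)          # OpenCV channel order is Y, Cr, Cb
    skin = ((cb >= cb_range[0]) & (cb <= cb_range[1]) &
            (cr >= cr_range[0]) & (cr <= cr_range[1]))
    return skin.astype(np.uint8)          # binary image F: 1 = skin, 0 = background
```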
“…In [1], [2], numerous systems are proposed to control a robot or a wheelchair using head or face movement. Such systems involve body movement and are not suitable for people with extreme physical disabilities for whom head or face movement is difficult.…”
Section: Introduction (citation type: mentioning)
confidence: 99%