2011
DOI: 10.1142/s0219843611002629
Fusing EMG and Visual Data for Hands-Free Control of an Intelligent Wheelchair

Abstract: This paper presents a novel hands-free human-machine interface (HMI) for elderly and disabled people by fusing multi-modality bioinformation abstracted from forehead electromyography (EMG) signals and facial images of a user. The interface allows users to drive an electric-powered wheelchair using face movements such as jaw clenching and eye blinking. An indoor environment is set up for evaluating the application of this interface. Five intact subjects participated in the experiment to drive the intelligent wh…
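The abstract describes mapping face movements detected from two modalities (forehead EMG and facial images) into discrete driving commands. A minimal, hypothetical sketch of such decision-level fusion is shown below; the thresholds, detector functions, and command mapping are illustrative assumptions, not details taken from the paper:

```python
# Hypothetical decision-level fusion of two modalities into wheelchair
# commands. Thresholds and the command table are illustrative assumptions.

def detect_jaw_clench(emg_rms: float, threshold: float = 0.5) -> bool:
    """Detect a jaw clench from forehead-EMG RMS amplitude (assumed threshold)."""
    return emg_rms > threshold

def detect_blink(eye_openness: float, threshold: float = 0.2) -> bool:
    """Detect a deliberate blink from a vision-based eye-openness score."""
    return eye_openness < threshold

def fuse_to_command(jaw_clench: bool, left_blink: bool, right_blink: bool) -> str:
    """Map the fused face-movement detections to a discrete driving command."""
    if jaw_clench and left_blink and right_blink:
        return "stop"
    if jaw_clench and not (left_blink or right_blink):
        return "forward"
    if left_blink and not right_blink:
        return "turn_left"
    if right_blink and not left_blink:
        return "turn_right"
    return "idle"

# Example: weak EMG (no clench), left eye closed, right eye open.
command = fuse_to_command(
    jaw_clench=detect_jaw_clench(0.2),
    left_blink=detect_blink(0.1),
    right_blink=detect_blink(0.9),
)
print(command)  # -> turn_left
```

A real system would of course classify windows of filtered EMG features and tracked facial landmarks rather than single scalar scores, but the fusion step reduces to a mapping of this shape.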

Cited by 13 publications (6 citation statements)
References 11 publications
“…In addition, researchers have used technologies originally developed for mobile robots to create smart wheelchairs that reduce the physical, perceptual, and cognitive skills necessary to operate a power wheelchair for individuals with severe dysfunction disorders such as amyotrophic lateral sclerosis (ALS), spinal cord injury (SCI), and muscular dystrophy (MD) [13, 49]. Different kinds of input methods, such as joysticks [2, 50], voice commands [51, 52], the sip-and-puff interface [6], BCI [9, 10, 17, 43], the tongue drive system (TDS) [7, 8, 32–35], the head-gesture-based interface (HGI) [53, 54], the eye-controlled interface [3, 39, 55–61], the EMG-based interface [62, 63], and the multimodal interface [64, 65], have been used in EPW HMIs to accommodate the disabled. Some examples of the remarkable technological advances in EPW HMI methodology in recent years are shown in detail in Figure 8.…”
Section: Bibliometric Results and Discussion
confidence: 99%
“…Technologies that sense neuromuscular activation have also been investigated. For example, electromyography, which measures muscular activity [13, 17, 38, 47–49]; electrooculography, which measures eye movements [1, 47]; and electroencephalography, which measures brain activity [5, 13, 31, 51], can all be used to detect a user's intention for wheelchair control. These methods have great potential to help severely disabled individuals with very limited mobility.…”
Section: Related Work
confidence: 99%
“…Wei et al. [49] published a usability study of a wheelchair control system that relies on EMG signals and facial gesture recognition to generate six discrete commands. Five users navigated a trajectory using this interface as well as a joystick.…”
Section: Related Work
confidence: 99%
“…Fehr et al. observe that close to 50% of the affected user group could be assisted if better control methods, with supplemented user interfaces and/or support systems capable of accommodating their needs and preferences, were employed. A large body of research on joysticks and related interfaces, including haptic systems, has emerged [3–7], and new control models [8, 9] continue to be developed. The available driver models, however, lack individuality [10], focusing mostly on common user attributes and assuming that all users respond to particular navigational situations with similar general patterns.…”
Section: Introduction
confidence: 99%