2007
DOI: 10.1088/1742-6596/90/1/012094
Human–machine interface based on muscular and brain signals applied to a robotic wheelchair

Cited by 38 publications (27 citation statements)
References 3 publications
“…In considering the potential capabilities of these bioelectric signals, their extracted features can be used individually or combined in many research areas, especially in designing human–machine interfaces. It was found that even signals recorded from facial muscle activity show noticeable pattern changes as subjects tire from repeated smiling, nose wrinkling, or frowning, causing the bioelectric signals to gradually lose similarity among the patterns over time [10].…”
Section: Introduction
confidence: 99%
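The pattern drift described in this statement can be illustrated with a toy similarity measure. The sketch below is illustrative only: the feature vectors are synthetic, and cosine similarity is just one plausible way to compare an early-session template against a later contraction's features.

```python
import numpy as np

def pattern_similarity(template, window):
    """Cosine similarity between a reference feature vector and a new one."""
    return float(np.dot(template, window) /
                 (np.linalg.norm(template) * np.linalg.norm(window)))

# Hypothetical feature vectors (e.g., RMS amplitude per facial-EMG channel).
template = np.array([1.0, 0.8, 0.6])   # early-session reference pattern
fresh    = np.array([0.95, 0.82, 0.58])  # shortly after the reference
fatigued = np.array([0.60, 0.95, 0.30])  # after many repeated contractions

# As fatigue sets in, similarity to the original template drops.
sim_fresh = pattern_similarity(template, fresh)
sim_fatigued = pattern_similarity(template, fatigued)
```

With these synthetic values, `sim_fresh` stays near 1 while `sim_fatigued` is noticeably lower, mirroring the gradual loss of pattern similarity reported in [10].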
“…Both devices use high-level motion primitives (e.g., go to the kitchen) in a menu-based system. Another synchronous device is the Ferreira et al. wheelchair, which uses the desynchronization of alpha rhythms in the visual cortex that occurs when the eyes are open or closed [17]. This desynchronization is used as a binary input to select low-level motion primitives (e.g., front, back, left, and right) in a sweeping menu-based system.…”
confidence: 99%
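The alpha-desynchronization switch described above can be sketched as a relative-band-power threshold. This is a minimal, hypothetical example: the sampling rate, band limits, threshold, and command names are assumptions for illustration, not values from the cited paper.

```python
import numpy as np

FS = 256  # Hz, assumed sampling rate

def alpha_power(eeg_window):
    """Relative power in the 8-12 Hz (alpha) band of one EEG window."""
    spectrum = np.abs(np.fft.rfft(eeg_window)) ** 2
    freqs = np.fft.rfftfreq(len(eeg_window), d=1.0 / FS)
    band = spectrum[(freqs >= 8) & (freqs <= 12)].sum()
    return band / spectrum.sum()

def binary_command(eeg_window, threshold=0.2):
    """Eyes closed -> strong alpha -> 'select'; eyes open -> 'idle'."""
    return "select" if alpha_power(eeg_window) > threshold else "idle"

# Synthetic one-second windows: a 10 Hz rhythm dominates with eyes closed.
rng = np.random.default_rng(0)
t = np.arange(FS) / FS
eyes_closed = np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(FS)
eyes_open = 0.1 * rng.standard_normal(FS)
```

In a sweeping menu, `binary_command` would be polled once per highlighted option: the user keeps their eyes open while unwanted options sweep past and closes them to confirm a selection.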
“…Matsumoto and Ino et al. [11] applied recognition of head motion and eye gaze to a locomotive wheelchair system. Ferreira and Silva et al. [12] proposed an HMI structure to control a robotic wheelchair by scalp EMG and EEG signals. Both eye-blinking and eye-closing movements are used as control commands to drive a mobile wheelchair through an onboard PDA.…”
Section: Introduction
confidence: 99%
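An eye-blink command of the kind described could be detected with a simple amplitude threshold on the rectified EMG. The sketch below is illustrative only: the threshold, sampling rate, and minimum burst width are assumed values, not parameters taken from [12].

```python
import numpy as np

def detect_blink(emg, fs=1000, threshold=0.5, min_width=0.05):
    """Flag a blink when rectified EMG stays above `threshold` for at
    least `min_width` seconds (all parameter values are illustrative)."""
    above = np.abs(emg) > threshold
    run, best = 0, 0
    for sample_above in above:          # longest supra-threshold run
        run = run + 1 if sample_above else 0
        best = max(best, run)
    return best >= min_width * fs

# Synthetic one-second traces: quiet baseline vs. a 100 ms blink burst.
fs = 1000
t = np.arange(fs) / fs
baseline = 0.05 * np.sin(2 * np.pi * 5 * t)
blink = baseline.copy()
blink[400:500] += 1.0  # simulated blink artifact
```

Requiring a minimum burst width is one common way to separate deliberate blinks from brief noise spikes; a real system would also need to distinguish voluntary blinks from spontaneous ones.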