2009 IEEE/RSJ International Conference on Intelligent Robots and Systems 2009
DOI: 10.1109/iros.2009.5354534
Navigating a smart wheelchair with a brain-computer interface interpreting steady-state visual evoked potentials

Cited by 78 publications (61 citation statements)
References 23 publications
“…There are three papers about BCWs with low-level navigation (41,48,50), one with high-level navigation (58), and two with shared control (30,55). Although the SSVEP BCWs are based on the selection of a visual stimulus, such as the P300-BCW, we found a clear predominance of prototypical low-level navigation and shared-control systems with four or five commands.…”
mentioning
confidence: 70%
“…In [6], a band-pass filter of 1–35 Hz is used. In [14], the measured signal is linearly combined with an optimal weight vector to cancel out noise. In [17], signals are filtered using a band-pass filter between 0.1 Hz and 35 Hz.…”
Section: B Pre-processing
mentioning
confidence: 99%
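The band-pass pre-processing described in the statement above can be sketched as a zero-phase Butterworth filter. This is a minimal illustration, not the cited papers' implementation: the sampling rate (256 Hz), filter order, and function names are assumptions.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def bandpass_filter(eeg, fs=256.0, low=1.0, high=35.0, order=4):
    """Zero-phase Butterworth band-pass, sketching the 1-35 Hz
    filtering cited above. fs, low, high, order are illustrative."""
    nyq = 0.5 * fs
    b, a = butter(order, [low / nyq, high / nyq], btype="band")
    return filtfilt(b, a, eeg, axis=-1)

# Demo: a 10 Hz component lies inside the passband and survives,
# while a 60 Hz mains component is strongly attenuated.
fs = 256.0
t = np.arange(0, 2.0, 1.0 / fs)
signal = np.sin(2 * np.pi * 10 * t) + 0.5 * np.sin(2 * np.pi * 60 * t)
filtered = bandpass_filter(signal, fs=fs)
```

`filtfilt` runs the filter forward and backward, which avoids the phase lag a causal filter would introduce into the EEG epoch.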
“…In [10], the data are filtered using a moving-average technique, downsampled by a factor of 16, and the data from all 16 channels are concatenated into a single feature vector for the classification algorithm, which uses stepwise linear discriminant analysis. In [13], 1 second of EEG data is extracted after each stimulus onset, filtered with a moving-average method, and downsampled by a factor of 16; the number of channels selected varies from six to ten depending on the participant, so if ten channels are selected the feature-vector length is 265/16 samples × 10 channels, the signal being sampled at 265 Hz. Stepwise linear discriminant analysis is used to classify the P300, achieving a performance higher than 90%. In [14], the power at each stimulus frequency is computed from the acquired brain signal, and a linear classifier determines which stimulus the subject is focusing on; commands are issued only when the power exceeds a particular threshold, and if more than one power exceeds the threshold, the frequency with the highest power is classified. In [16], context-based menus of commands are displayed to the user and the menus flash randomly; EEG data from 10 ms to 500 ms are extracted and fed into a support vector machine, which outputs a new score for each button. When a score exceeds the threshold, it is issued as a command; when more than one button crosses the threshold, the menu item with the highest score is issued.…”
Section: Feature Extraction and Classification
mentioning
confidence: 99%
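The P300 feature pipeline summarized above (moving-average smoothing, downsampling by 16, concatenating channels into one feature vector) can be sketched as follows. The window length, channel count, and function name are illustrative assumptions; only the downsampling factor and the cited 265 Hz rate come from the statement above.

```python
import numpy as np

def extract_p300_features(epoch, ma_window=16, factor=16):
    """Moving-average smoothing followed by downsampling, then
    channel concatenation, per the pipeline described above.
    `epoch` has shape (n_channels, n_samples); ma_window is an
    illustrative assumption."""
    kernel = np.ones(ma_window) / ma_window
    smoothed = np.apply_along_axis(
        lambda ch: np.convolve(ch, kernel, mode="same"), axis=1, arr=epoch)
    downsampled = smoothed[:, ::factor]  # keep every 16th sample
    return downsampled.ravel()           # concatenate all channels

# 1-second epoch: 10 channels at the cited 265 Hz sampling rate.
epoch = np.random.randn(10, 265)
features = extract_p300_features(epoch)  # 17 samples/channel x 10 channels
```

The resulting vector would then be fed to a classifier such as stepwise linear discriminant analysis, as the citing papers describe.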
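The SSVEP power-threshold rule attributed to [14] above (issue a command only when a stimulus frequency's power exceeds a threshold; if several do, take the strongest) can be sketched like this. The function name, threshold value, and FFT-based power estimate are assumptions for illustration.

```python
import numpy as np

def classify_ssvep(eeg, fs, stim_freqs, threshold):
    """Return the stimulus frequency whose spectral power exceeds
    `threshold`, per the power-threshold rule cited above; return
    None when no frequency crosses it. Power is estimated from the
    FFT bin nearest each stimulus frequency (an assumption)."""
    n = len(eeg)
    spectrum = np.abs(np.fft.rfft(eeg)) ** 2 / n
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    powers = [spectrum[np.argmin(np.abs(freqs - f))] for f in stim_freqs]
    best = int(np.argmax(powers))
    return stim_freqs[best] if powers[best] > threshold else None

# Demo: a clean 13 Hz oscillation among four candidate stimuli.
fs = 256.0
t = np.arange(0, 2.0, 1.0 / fs)
eeg = np.sin(2 * np.pi * 13 * t)
detected = classify_ssvep(eeg, fs, [8.0, 10.0, 13.0, 15.0], threshold=1.0)  # → 13.0
```

Gating on a threshold rather than always taking the maximum is what lets such a system stay idle when the user is not attending to any stimulus.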
“…These direct, intuitive controls make interaction between human and machine much faster and more convenient. They are widely used in prostheses [1], orthoses [2], robotic arms [3] and wheelchairs for disabled people [4].…”
Section: Introduction
mentioning
confidence: 99%