Recent Advances in Brain-Computer Interface Systems 2011
DOI: 10.5772/13935

Feature Extraction by Mutual Information Based on Minimal-Redundancy-Maximal-Relevance Criterion and Its Application to Classifying EEG Signal for Brain-Computer Interfaces

Cited by 3 publications (7 citation statements)
References: 28 publications
“…Assume a random variable X representing a continuous-valued random feature vector and a discrete-valued random variable C representing the class labels. In accordance with Shannon's information theory, the uncertainty of the class label C can be measured by the entropy H(C) = −Σ_c P(c) log P(c) (Erfanian et al., 2011).…”
Section: Mutual Information Theory (MI)
Citation type: mentioning
confidence: 99%
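As a concrete illustration of this definition, here is a minimal sketch of estimating H(C) from empirical class frequencies; the function and variable names are illustrative and not taken from the chapter:

import numpy as np

def class_entropy(labels):
    """Shannon entropy H(C) of a discrete class-label vector, in bits."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()        # empirical class probabilities P(c)
    return -np.sum(p * np.log2(p))   # H(C) = -sum_c P(c) log2 P(c)

# Example: a balanced two-class labeling has the maximal entropy of 1 bit.
labels = np.array([0, 1, 0, 1, 1, 0])
print(class_entropy(labels))  # -> 1.0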
“…The waves are sometimes classified both by their frequency and by their shape. Six important signal types are distinguished [4], [8]-[12]. 1) Beta waves: their frequency lies between 13 and 30 Hz, and their amplitude is low, about 5 to 30 µV.…”
Section: Signal Classification
Citation type: mentioning
confidence: 99%
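To make the frequency-band description concrete, here is a minimal sketch of isolating the beta band (13 to 30 Hz) from a single EEG channel with a zero-phase Butterworth band-pass filter; the sampling rate and signal array are assumptions for illustration, not values from the citing paper:

import numpy as np
from scipy.signal import butter, filtfilt

fs = 256.0  # assumed sampling rate in Hz

def bandpass(x, lo, hi, fs, order=4):
    """Zero-phase Butterworth band-pass filter between lo and hi Hz."""
    b, a = butter(order, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return filtfilt(b, a, x)

eeg = np.random.randn(int(10 * fs))   # placeholder for one EEG channel
beta = bandpass(eeg, 13.0, 30.0, fs)  # beta band: 13-30 Hz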
“…Indeed, MI is zero if and only if the two random variables are strictly independent (Erfanian et al., 2011).…”
Section: Mutual Information Theory (MI)
Citation type: mentioning
confidence: 99%
“…It is equal (H(C|X) = H(C)) if and only if the two variables C and X are independent. The amount by which the class uncertainty is decreased is, by definition, the mutual information, I(X; C) = H(C) − H(C|X); applying the identities p(c, x) = p(c|x) p(x) and p(c) = ∫ p(c, x) dx, it can be expressed as I(X; C) = Σ_c ∫ p(c, x) log [ p(c, x) / (p(c) p(x)) ] dx (Erfanian et al., 2011).…”
Section: Mutual Information Theory (MI)
Citation type: mentioning
confidence: 99%
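The decomposition I(X; C) = H(C) − H(C|X) lends itself to a simple plug-in estimate. Below is a minimal sketch that discretizes a continuous feature into bins and estimates the mutual information from empirical frequencies; the bin count and all names are illustrative assumptions, not the chapter's own estimator:

import numpy as np

def entropy(p):
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def mutual_information(x, c, bins=16):
    """Plug-in estimate of I(X; C) = H(C) - H(C|X) for one feature x."""
    edges = np.histogram_bin_edges(x, bins=bins)
    xd = np.digitize(x, edges)                     # bin index per sample
    classes = {lab: j for j, lab in enumerate(np.unique(c))}
    joint = np.zeros((bins + 2, len(classes)))     # joint counts over (bin, class)
    for xi, ci in zip(xd, c):
        joint[xi, classes[ci]] += 1
    joint /= joint.sum()                           # joint probabilities p(x, c)
    p_c = joint.sum(axis=0)                        # marginal P(c)
    p_x = joint.sum(axis=1)                        # marginal P(x bin)
    h_c = entropy(p_c)                             # H(C)
    h_c_given_x = 0.0                              # H(C|X) = sum_x P(x) H(C | X = x)
    for i, px in enumerate(p_x):
        if px > 0:
            h_c_given_x += px * entropy(joint[i] / px)
    return h_c - h_c_given_x

# Example: a feature that separates the two classes carries high MI.
rng = np.random.default_rng(0)
c = rng.integers(0, 2, 1000)
x = c + 0.3 * rng.standard_normal(1000)
print(mutual_information(x, c))  # close to 1 bit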