2008
DOI: 10.1155/2008/673040
A Minimax Mutual Information Scheme for Supervised Feature Extraction and Its Application to EEG-Based Brain-Computer Interfacing

Cited by 18 publications (9 citation statements). References 16 publications.

“…In order to meet this challenge, this paper introduces mutual information, which measures the mutual dependence of two random variables, or the reduction in uncertainty of one random variable given the other. The information-theoretic approach has recently received considerable attention in both the BCI [25], [31], [32] and the machine learning communities [33], [34], [35] for selecting an informative subset of the original features. Unlike those uses of mutual information, in this paper we consider it for the likelihood computation in the proposed Bayesian framework.…”
Section: Likelihood Estimation With Mutual Information (mentioning)
confidence: 99%
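
For reference, the mutual information invoked in the excerpt above is the standard information-theoretic quantity; the following definition is textbook background, not a formula quoted from the paper. For discrete random variables X (a feature) and Y (a class label),

$$ I(X;Y) \;=\; H(Y) - H(Y \mid X) \;=\; \sum_{x,\,y} p(x,y)\,\log\frac{p(x,y)}{p(x)\,p(y)}, $$

which vanishes exactly when X and Y are independent, so a large I(X;Y) between a feature and the class label signals a discriminative feature.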
“…Let S(t) and D(t), respectively, be the set of candidate SFVPs and the current optimal spatial filter composed of the class-discriminative SFVPs, which we call “Discriminative Spatial Filter Vector Pairs” (DSFVPs), at the t-th iteration. The class-discriminative power of the DSFVPs is evaluated by means of the mutual information between feature vectors and class labels, which has been widely used for feature selection in both the BCI (Zhang et al.; Lan et al.; Oveisi and Erfanian; Suk and Lee) and the machine learning communities (Kwak and Choi; Peng et al.; Leiva-Murillo and Artés-Rodríguez). The computational issue for mutual information is covered in the next subsection.…”
Section: Class-Discriminative Spatial Filter Selection (mentioning)
confidence: 99%
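
As an illustration of this style of MI-based feature scoring, here is a minimal sketch using a histogram plug-in estimator; the function name, binning scheme, and variable names are illustrative assumptions, not the cited papers' implementations.

import numpy as np

def mutual_information(feature, labels, n_bins=10):
    """Plug-in estimate of I(X; Y) between one continuous feature and
    discrete class labels, using equal-width histogram binning."""
    feature = np.asarray(feature, dtype=float)
    labels = np.asarray(labels)
    # Discretize the feature; digitizing against the interior edges
    # yields bin indices in 0..n_bins-1.
    edges = np.histogram_bin_edges(feature, bins=n_bins)
    x = np.digitize(feature, edges[1:-1])
    mi = 0.0
    for xv in np.unique(x):
        p_x = np.mean(x == xv)
        for c in np.unique(labels):
            p_y = np.mean(labels == c)
            p_xy = np.mean((x == xv) & (labels == c))
            if p_xy > 0.0:
                mi += p_xy * np.log(p_xy / (p_x * p_y))
    return mi  # in nats; divide by np.log(2) for bits

# Usage: rank the columns of a feature matrix X (n_samples, n_features)
# by their estimated relevance to the class labels y (n_samples,):
#   scores = [mutual_information(X[:, j], y) for j in range(X.shape[1])]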
“…It is also assumed that both the mixture variables and the independent components have zero mean (Oveisi et al., 2008).…”
Section: Genetic Algorithm (mentioning)
confidence: 99%
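
The zero-mean assumption quoted above is normally enforced by centering the observations before running ICA. A minimal preprocessing sketch follows, assuming a generic EEG-style channels-by-samples layout; this is standard centering, not the cited paper's code.

import numpy as np

def center(X):
    """Subtract each channel's mean from a (n_channels, n_samples) mixture
    matrix. Because ICA models the observations as linear mixtures of the
    sources, zero-mean mixtures imply zero-mean independent components."""
    channel_means = X.mean(axis=1, keepdims=True)
    return X - channel_means, channel_means

# Usage: X_centered, means = center(X); run ICA on X_centered.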