2009
DOI: 10.1016/j.neucom.2008.12.035

Information-theoretic feature selection for functional data classification


Cited by 53 publications (26 citation statements)
References 10 publications
“…And the selection procedure will be terminated if the number of selected features is larger than the user-specified threshold d. In this work, we mainly focused on the method for ranking the features. To determine the value of d, some alternative techniques, such as the inconsistency rate [39], permutation test [40] or wrappers [7], can be adopted. For each candidate feature f ∈ F do (6) Calculate its value of criterion J(f) = R(f, class) × w(f);…”
Section: Selection Algorithm
confidence: 99%
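The ranking-then-threshold procedure quoted above can be sketched as follows. This is a minimal illustration, not the cited paper's implementation: the relevance function R(f, class) and weight w(f) are hypothetical stand-ins passed in as callables, and the stopping rule is simply "keep the top d features".

```python
# Sketch of the quoted ranking procedure: score each candidate feature
# with J(f) = R(f, class) * w(f), then keep the d highest-scoring ones.
# `relevance` and `weight` are illustrative stand-ins, not the paper's
# actual definitions of R and w.

def rank_features(features, relevance, weight, d):
    """Return the d highest-scoring feature names.

    features  -- iterable of candidate feature names
    relevance -- maps feature -> relevance to the class, R(f, class)
    weight    -- maps feature -> weighting factor, w(f)
    d         -- user-specified number of features to select
    """
    scored = [(relevance(f) * weight(f), f) for f in features]
    scored.sort(reverse=True)              # highest criterion J(f) first
    return [f for _, f in scored[:d]]      # terminate once d features chosen

# Toy usage with made-up scores:
R = {"f1": 0.9, "f2": 0.2, "f3": 0.7}.get
w = {"f1": 1.0, "f2": 1.0, "f3": 0.5}.get
print(rank_features(["f1", "f2", "f3"], R, w, d=2))  # ['f1', 'f3']
```

As the excerpt notes, d itself would in practice be set by an auxiliary technique (inconsistency rate, permutation test, or a wrapper) rather than fixed by hand.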
“…A forward-backward sequential supervised feature selection algorithm 6,11 is implemented in the experiments as follows: during the forward step the selected feature subset S starts empty; at each iteration the feature f i that, together with S, has the largest MI with Y is permanently added to S. The procedure continues until a given stopping criterion is reached. The backward step starts from the final subset S of the forward step.…”
Section: Methods
confidence: 99%
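The forward step of this excerpt can be sketched for discrete features as follows. This is a simplified illustration, not the cited implementation: `mutual_info` is a plain empirical plug-in estimator, the joint MI of a subset is computed by treating the selected columns as one tuple-valued feature, the stopping criterion is reduced to a subset-size cap, and the backward step is omitted.

```python
import math
from collections import Counter

def mutual_info(xs, ys):
    """Empirical mutual information (in nats) of two discrete sequences."""
    n = len(xs)
    pxy, px, py = Counter(zip(xs, ys)), Counter(xs), Counter(ys)
    return sum(c / n * math.log(c * n / (px[x] * py[y]))
               for (x, y), c in pxy.items())

def forward_select(X, y, stop_after):
    """Greedy forward step: repeatedly add the feature whose joint tuple
    with the current subset S has the largest MI with y."""
    selected, remaining = [], list(range(len(X)))
    while remaining and len(selected) < stop_after:
        def joint_mi(j):
            cols = selected + [j]
            joined = list(zip(*(X[k] for k in cols)))  # S plus candidate j
            return mutual_info(joined, y)
        best = max(remaining, key=joint_mi)
        selected.append(best)        # permanently added to S
        remaining.remove(best)
    return selected

# Toy usage: y is the XOR of features 0 and 1; feature 2 is irrelevant.
x0 = [0, 0, 1, 1, 0, 0, 1, 1]
x1 = [0, 1, 0, 1, 0, 1, 0, 1]
x2 = [0, 0, 0, 0, 1, 1, 1, 1]
y = [a ^ b for a, b in zip(x0, x1)]
print(forward_select([x0, x1, x2], y, stop_after=2))  # [0, 1]
```

The XOR example shows why the joint criterion matters: neither x0 nor x1 alone carries any MI with y, but their pair determines it completely, so the second greedy step correctly prefers x1 over the irrelevant x2.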
“…Although each method has its advantages and disadvantages, mutual information has proven to be an appropriate measure in several applications such as selection of spectral variables, spectrometric nonlinear modelling and functional data classification, see Gomez-Verdejo et al. (2009); Rossi et al. (2007; …). Moreover, as discussed in Cover & Thomas (1991), correlation does not measure nonlinear relations among features and the wrapper approach presents a high computational…”
Section: Subset Relevant Assessment
confidence: 99%
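The contrast this excerpt draws between correlation and mutual information can be demonstrated with a toy example (the helper functions below are illustrative, not from the cited works): for y = x² on a grid symmetric about zero, the Pearson correlation is exactly zero even though y is a deterministic function of x, while the empirical mutual information is strictly positive.

```python
import math
from collections import Counter

def pearson(xs, ys):
    """Sample Pearson correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / math.sqrt(vx * vy)

def mutual_info(xs, ys):
    """Empirical mutual information (in nats) of two discrete sequences."""
    n = len(xs)
    pxy, px, py = Counter(zip(xs, ys)), Counter(xs), Counter(ys)
    return sum(c / n * math.log(c * n / (px[x] * py[y]))
               for (x, y), c in pxy.items())

x = [-2, -1, 0, 1, 2]
y = [v * v for v in x]            # purely nonlinear dependence
print(pearson(x, y))              # 0.0 -- correlation sees nothing
print(mutual_info(x, y) > 0)      # True -- MI detects the dependence
```

This is exactly the failure mode the excerpt attributes to correlation: a linear measure assigns zero to a perfectly predictable but nonlinear relationship, whereas MI is zero only under genuine independence.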