Sixth International Conference on Machine Learning and Applications (ICMLA 2007)
DOI: 10.1109/icmla.2007.35
Enhanced recursive feature elimination

Cited by 239 publications (104 citation statements)
References 10 publications
“…To do so, standard RFE removes a feature if it is weak or redundant at a particular step and retains independent and relevant features. EnRFE is an RFE-derived technique that redefines the criterion for removing features at each stage of training, so that redundant or weak features that improve performance estimates when combined with each other or with other relevant features can be retained [35]. In an iterative scheme, the SVM-EnRFE classifier is trained until a core set of variables with the highest discriminative power remains.…”
Section: Figure
mentioning confidence: 99%
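The distinction this statement draws can be made concrete. The sketch below is a minimal, hypothetical reconstruction of the EnRFE-style criterion from the one-sentence description above, not the reference implementation of Chen and Jeong (2007): standard RFE drops the lowest-ranked feature unconditionally, whereas this variant removes a weak feature only if the remaining subset still scores at least as well, so weak features that help in combination are retained. The names `enrfe_select` and `subset_score` are invented for illustration.

```python
# Hypothetical sketch of an EnRFE-style removal criterion (not the
# reference implementation from Chen and Jeong, 2007).
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

def subset_score(X, y, features):
    """Cross-validated accuracy of a linear SVM on a feature subset."""
    return cross_val_score(SVC(kernel="linear"), X[:, features], y, cv=5).mean()

def enrfe_select(X, y, n_keep):
    """Drop the weakest feature only if removal does not degrade the
    performance estimate of the remaining subset; otherwise try the
    next-weakest, retaining weak features that help in combination."""
    features = list(range(X.shape[1]))
    while len(features) > n_keep:
        svm = SVC(kernel="linear").fit(X[:, features], y)
        weights = np.abs(svm.coef_).sum(axis=0)   # per-feature importance
        current = subset_score(X, y, features)
        removed = False
        for idx in np.argsort(weights):           # weakest candidates first
            candidate = features[:idx] + features[idx + 1:]
            if subset_score(X, y, candidate) >= current:
                features = candidate
                removed = True
                break
        if not removed:
            break   # every removal hurts performance; core set reached
    return features

if __name__ == "__main__":
    from sklearn.datasets import make_classification
    X, y = make_classification(n_samples=60, n_features=15,
                               n_informative=5, random_state=0)
    print(enrfe_select(X, y, n_keep=5))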
“…A linear kernel was used to represent the data, reduce computational cost and improve classification accuracy (i.e., the overall rate of correct classification). An enhanced recursive feature elimination procedure (Chen and Jeong, 2007) was implemented to ensure that discrimination accuracy was not due to overfitting and to select the most predictive features (brain areas). Finally, we used leave-one-out cross-validation (Geisser, 1993) to test classification accuracy and whether the results were independent of the initial training data.…”
Section: Methods
mentioning confidence: 99%
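The pipeline this statement describes (linear-kernel SVM, RFE-based feature selection, leave-one-out cross-validation) can be sketched with scikit-learn as follows. This is a minimal illustration under assumed synthetic data, with the library's standard RFE class standing in for the enhanced procedure:

```python
# Sketch of the described pipeline: linear-kernel SVM, RFE feature
# selection, leave-one-out cross-validation. Standard sklearn RFE is a
# stand-in for the enhanced variant; the data below are synthetic.
from sklearn.datasets import make_classification
from sklearn.feature_selection import RFE
from sklearn.model_selection import LeaveOneOut, cross_val_score
from sklearn.pipeline import Pipeline
from sklearn.svm import SVC

# Hypothetical stand-in data: rows are subjects, columns are brain areas.
X, y = make_classification(n_samples=40, n_features=100,
                           n_informative=10, random_state=0)

pipe = Pipeline([
    # Selection is refitted inside every fold, so accuracy is not
    # inflated by choosing features using the held-out subject.
    ("rfe", RFE(SVC(kernel="linear"), n_features_to_select=20, step=10)),
    ("svm", SVC(kernel="linear")),
])

scores = cross_val_score(pipe, X, y, cv=LeaveOneOut())
print("leave-one-out accuracy:", scores.mean())
```

Wrapping selection and classification in one Pipeline is what keeps the cross-validation honest: fitting RFE on the full data before splitting would leak information from the held-out subject into the feature choice.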
“…Segmentation is an essential process in GEOBIA, and the definition of segmentation parameters can have a significant influence on classification accuracy [19][20][21]. While the introduction of abundant features brings additional information that is relevant to target recognition, it can also introduce noise or redundant information that degrades classification accuracy [22,23]. Moreover, several classification algorithms have been proposed, and their performances (accuracy, efficiency and feasibility) in GEOBIA vary considerably [24][25][26].…”
Section: Introduction
mentioning confidence: 99%
“…It is also important to note that when the number of features is large, it is difficult to evaluate the accuracy of all possible feature combinations because of the huge computational effort and time required. Thus, for operational purposes, a series of search strategies are used to screen for the optimal feature subset [22,47]. However, few studies have evaluated the performance of these feature selection methods, especially in GEOBIA crop recognition applications involving multisource satellite imagery.…”
Section: Introduction
mentioning confidence: 99%
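As a generic illustration of such a search strategy (not the procedure from any cited study), cross-validated recursive elimination screens a nested sequence of feature subsets and keeps the size that scores best, avoiding an exhaustive sweep over all 2^n combinations. A minimal sketch with scikit-learn's RFECV, on assumed synthetic data:

```python
# Generic greedy search over nested feature subsets: RFECV ranks
# features by eliminating the weakest one per step and selects the
# subset size with the best cross-validated score.
from sklearn.datasets import make_classification
from sklearn.feature_selection import RFECV
from sklearn.svm import SVC

X, y = make_classification(n_samples=200, n_features=30,
                           n_informative=8, random_state=0)

selector = RFECV(SVC(kernel="linear"), step=1, cv=5)
selector.fit(X, y)
print("optimal subset size:", selector.n_features_)
print("selected feature mask:", selector.support_)
```

The greedy ranking evaluates on the order of n subsets instead of 2^n, which is what makes this kind of screening feasible when the feature set is large.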