2008
DOI: 10.1109/tnn.2008.2000395

A General Wrapper Approach to Selection of Class-Dependent Features

Abstract: In this paper, we argue that for a C-class classification problem, C two-class classifiers, each of which discriminates one class from the remaining classes and has its own characteristic input feature subset, should in general outperform, or at least match the performance of, a C-class classifier with a single input feature subset. For each class, we select a desirable feature subset, which leads to the lowest classification error rate for that class using a classifier for a given feature subset search alg…
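
The class-dependent scheme the abstract describes is straightforward to prototype. Below is a minimal sketch, not the authors' implementation: it assumes scikit-learn, uses logistic regression as the base classifier, and a hypothetical wrapper_search helper stands in for whatever feature-subset search algorithm is plugged in, since the approach is deliberately generic in that component.

# Minimal sketch of class-dependent feature selection with one-vs-rest
# binary classifiers (illustrative only, not the paper's implementation).
# wrapper_search is a hypothetical stand-in for any subset search algorithm.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

def wrapper_search(X, y_bin, candidate_subsets, cv=5):
    # Wrapper criterion: pick the subset with the lowest cross-validated
    # error for this particular binary (one class vs. rest) task.
    best_subset, best_err = None, np.inf
    for subset in candidate_subsets:
        clf = LogisticRegression(max_iter=1000)
        err = 1.0 - cross_val_score(clf, X[:, subset], y_bin, cv=cv).mean()
        if err < best_err:
            best_subset, best_err = subset, err
    return best_subset

def fit_class_dependent(X, y, candidate_subsets):
    # One binary classifier per class, each with its own feature subset.
    models = {}
    for c in np.unique(y):
        y_bin = (y == c).astype(int)               # class c vs. the rest
        subset = wrapper_search(X, y_bin, candidate_subsets)
        clf = LogisticRegression(max_iter=1000).fit(X[:, subset], y_bin)
        models[c] = (subset, clf)
    return models

def predict_class_dependent(models, X):
    # Assign each sample to the class whose binary model is most confident.
    classes = sorted(models)
    scores = np.column_stack(
        [models[c][1].predict_proba(X[:, models[c][0]])[:, 1] for c in classes]
    )
    return np.asarray(classes)[scores.argmax(axis=1)]

The arbitration rule at the end (highest binary posterior wins) is an assumption of this sketch, not necessarily the combination rule used in the paper.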

Cited by 113 publications (43 citation statements: 1 supporting, 42 mentioning, 0 contrasting)
References 35 publications
“…The second, and more promising approach, consists of using a wrapper method [47] coupled with an evolutionary instance selection method such as the one described in Algorithm 5. Similar methods have been used to train neural networks [38], [48], although without focusing on boosting the classifier.…”
Section: Constructing Ensembles Using Other Classifiers (mentioning)
confidence: 99%
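
The "evolutionary instance selection method" in this excerpt (its Algorithm 5) is not reproduced on this page, so the sketch below shows only the generic pattern that family of methods follows: evolve binary masks over the training instances and score each mask, wrapper-style, by classifier accuracy on held-out data. All names and parameters here are illustrative assumptions; a 1-NN classifier serves as the evaluator.

# Generic genetic-algorithm instance selection in a wrapper setting.
# This is NOT Algorithm 5 from the citing paper, just the common pattern.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def ga_instance_selection(X_tr, y_tr, X_val, y_val,
                          pop=20, gens=30, p_mut=0.01, seed=None):
    rng = np.random.default_rng(seed)
    n = len(X_tr)
    population = rng.random((pop, n)) < 0.5        # random boolean instance masks

    def fitness(mask):
        if mask.sum() < 2:                         # degenerate mask: score as worst
            return 0.0
        clf = KNeighborsClassifier(n_neighbors=1)
        clf.fit(X_tr[mask], y_tr[mask])
        return clf.score(X_val, y_val)             # wrapper evaluation

    for _ in range(gens):                          # pop is assumed even
        scores = np.array([fitness(m) for m in population])
        parents = population[np.argsort(scores)[-pop // 2:]]   # truncation selection
        cuts = rng.integers(1, n, size=pop // 2)               # one-point crossover
        kids = np.array([np.concatenate([a[:c], b[c:]])
                         for a, b, c in zip(parents,
                                            np.roll(parents, 1, axis=0), cuts)])
        kids ^= rng.random(kids.shape) < p_mut                 # bit-flip mutation
        population = np.vstack([parents, kids])

    scores = np.array([fitness(m) for m in population])
    return population[scores.argmax()]             # best instance mask found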
“…In the feature selection, we choose those representative vectors that can effectively capture the difference between the samples from different classes (in other words, they contain sufficient discriminative information) from three MPCA models. There are a great number of feature selection algorithms in the literature that can be roughly grouped into three categories: 1) filters [42]; 2) wrappers [44]; and 3) hybrid algorithms [45]. The filters usually select the features without needing the classifier performance, whereas the wrappers need the classifier performance to select the features.…”
Section: Motivation Of Feature Selection Scheme In MPCA (mentioning)
confidence: 99%
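
The filter/wrapper contrast quoted in this excerpt fits in a few lines. A minimal sketch, assuming scikit-learn, with an ANOVA F-score as the classifier-free filter statistic and cross-validated logistic-regression accuracy as the wrapper criterion:

# Filter vs. wrapper, side by side: the filter ranks features by a
# classifier-free statistic; the wrapper scores whole candidate subsets
# by actual classifier performance.
import numpy as np
from sklearn.feature_selection import f_classif      # ANOVA F-score (filter)
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

def filter_select(X, y, k):
    # Keep the k features with the highest F-score; no classifier involved.
    scores, _ = f_classif(X, y)
    return np.argsort(scores)[-k:]

def wrapper_select(X, y, candidate_subsets, cv=5):
    # Keep the candidate subset that maximizes cross-validated accuracy.
    def cv_acc(subset):
        clf = LogisticRegression(max_iter=1000)
        return cross_val_score(clf, X[:, subset], y, cv=cv).mean()
    return max(candidate_subsets, key=cv_acc)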
“…Most algorithms use either a sequential search (for example, [4,5,24,26,30]) or a global search (e.g., [11,23,31–35]). On the basis of guiding the search strategies and evaluating the subsets, in contrast, the existing FS algorithms can be grouped into the following three approaches: wrapper (e.g., [4,6,30,36–38]), filter (e.g., [40,41]), and hybrid (e.g., [23,42]). It is well-known that wrapper approaches always return features with a higher saliency than filter approaches, as the former utilize the association of features collectively during the learning process, but are computationally more expensive [2].…”
Section: Existing Work For Feature Selection (mentioning)
confidence: 99%
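
Sequential search, the first family this excerpt mentions, is typified by sequential forward selection (SFS). A minimal wrapper-style sketch, assuming scikit-learn: greedily add the feature that most improves cross-validated accuracy and stop when no remaining feature helps, which is what makes sequential wrappers accurate but expensive.

# Sequential forward selection (SFS): a classic sequential wrapper search.
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

def sfs(X, y, cv=5):
    remaining = list(range(X.shape[1]))
    selected, best_acc = [], 0.0
    while remaining:
        def acc_with(f):
            clf = LogisticRegression(max_iter=1000)
            return cross_val_score(clf, X[:, selected + [f]], y, cv=cv).mean()
        accs = {f: acc_with(f) for f in remaining}
        f_best = max(accs, key=accs.get)
        if accs[f_best] <= best_acc:    # no candidate improves: stop
            break
        selected.append(f_best)
        remaining.remove(f_best)
        best_acc = accs[f_best]
    return selected, best_acc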
“…In the FS process, feature removal operations are performed sequentially, especially for those features that do not degrade accuracy of the NN upon removal. A class-dependent FS algorithm in [38] selects a desirable feature subset for each class. It first divides a C-class classification problem into C two-class classification problems.…”
Section: Existing Work For Feature Selection (mentioning)
confidence: 99%
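
The decomposition this last excerpt describes, one C-class problem into C two-class problems, corresponds to the y_bin = (y == c) step in the sketch after the abstract. A standalone toy illustration, assuming numpy:

# A C-class problem becomes C one-vs-rest binary problems, each of which
# can then be given its own feature subset by the wrapper search.
import numpy as np

y = np.array([0, 2, 1, 0, 2, 1])            # toy 3-class labels
binary_tasks = {c: (y == c).astype(int)     # class c vs. the rest
                for c in np.unique(y)}
for c, y_bin in binary_tasks.items():
    print(f"class {c} vs rest: {y_bin}")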