2009
DOI: 10.1007/978-3-642-04697-1_27

Image Categorization Using ESFS: A New Embedded Feature Selection Method Based on SFS

Abstract: Feature subset selection is an important subject when training classifiers in Machine Learning (ML) problems. Too many input features in an ML problem may lead to the so-called "curse of dimensionality", which describes the fact that the complexity of adjusting the classifier parameters during training increases exponentially with the number of features. Thus, ML algorithms are known to suffer from a significant decrease in prediction accuracy when faced with many features that are not necessary. In …
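The sequential forward search (SFS) that ESFS builds on can be illustrated with a short sketch. This is not the paper's implementation, only a generic greedy forward selection under an assumed subset-scoring function; the feature names and `toy_score` below are hypothetical:

```python
def sfs(features, score, k):
    """Greedy Sequential Forward Selection: start from the empty set and,
    at each step, add the single feature that most improves the subset score."""
    selected = []
    remaining = list(features)
    while remaining and len(selected) < k:
        best = max(remaining, key=lambda f: score(selected + [f]))
        selected.append(best)
        remaining.remove(best)
    return selected

# Hypothetical scoring function: rewards features 'a' and 'b' and penalizes
# subset size, mimicking the accuracy cost of unnecessary features.
def toy_score(subset):
    relevance = {'a': 3.0, 'b': 2.0, 'c': 0.5, 'd': 0.1}
    return sum(relevance[f] for f in subset) - 0.4 * len(subset)

print(sfs(['a', 'b', 'c', 'd'], toy_score, k=2))  # ['a', 'b']
```

Because each step conditions on the features already chosen, SFS can capture feature interactions that per-feature filter rankings miss, at the cost of a greedy (non-optimal) search.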

Cited by 18 publications (6 citation statements)
References 18 publications
“…The embedded method [6] integrates the feature generation strategy and the assessment of the quality of the selected subset into the learning algorithm, so the link established between feature selection and classification is stronger than in the wrapper method [22]. This combines the advantages of filter- and wrapper-based methods with a reduced computational time [9,22]. The implicit feature generation built into learning algorithms such as decision tree induction makes CART, ID3 and C4.5 embedded learning methods [41].…”
Section: Embedded Methods
confidence: 99%
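The point that decision tree induction is itself an embedded feature selector can be made concrete: ID3/C4.5-style algorithms rank features by information gain at every split, so irrelevant features are simply never chosen. A minimal, self-contained sketch (the toy data are illustrative, not from the cited works):

```python
import math

def entropy(labels):
    """Shannon entropy of a label list, in bits."""
    n = len(labels)
    counts = {}
    for y in labels:
        counts[y] = counts.get(y, 0) + 1
    return -sum(c / n * math.log2(c / n) for c in counts.values())

def information_gain(rows, labels, feature):
    """Gain from splitting on one binary feature: the quantity an ID3-style
    tree maximizes when choosing a split, which is what makes tree induction
    an embedded feature-selection method."""
    gain = entropy(labels)
    for value in (0, 1):
        subset = [y for x, y in zip(rows, labels) if x[feature] == value]
        if subset:
            gain -= len(subset) / len(labels) * entropy(subset)
    return gain

# Hypothetical toy data: feature 0 perfectly predicts the label, feature 1 is noise.
X = [(0, 0), (0, 1), (1, 0), (1, 1)]
y = [0, 0, 1, 1]
print(information_gain(X, y, 0))  # 1.0 (fully informative)
print(information_gain(X, y, 1))  # 0.0 (irrelevant)
```

Selection here is a by-product of training itself, which is exactly the "stronger link" between selection and classification that the quoted passage attributes to embedded methods.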
“…Using ESFS as a filter method, we carried out experiments comparing its behavior with that of other filter feature-selection techniques, including the Fisher filter method, PCA, and SFS. Using the Berlin dataset for emotional speech recognition and the Simplicity dataset for visual object recognition, our ESFS displayed better performance, showing its effectiveness in the selection of relevant features [35].…”
Section: T-norm: A T-norm Is A Function
confidence: 97%
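For reference, the Fisher filter criterion mentioned in the comparison scores each feature independently as between-class variance over within-class variance, with no interaction with the downstream classifier. A small illustrative sketch (the toy values are made up):

```python
def fisher_score(values, labels):
    """Fisher criterion for a single feature: between-class scatter divided by
    within-class scatter; higher means the feature separates the classes better."""
    mu = sum(values) / len(values)
    between = within = 0.0
    for c in set(labels):
        vc = [v for v, y in zip(values, labels) if y == c]
        mc = sum(vc) / len(vc)
        between += len(vc) * (mc - mu) ** 2
        within += sum((v - mc) ** 2 for v in vc)
    return between / within if within else float('inf')

# Hypothetical toy feature: classes 0 and 1 are well separated, so the score is large.
print(fisher_score([1.0, 1.1, 5.0, 5.2], [0, 0, 1, 1]))
```

Because such filters rank features one at a time before any classifier is trained, they are cheap but blind to feature interactions, which is the behavior ESFS is being compared against here.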
“…They share the advantage of wrapper methods that feature subset selection interacts with the learning task, so the selected features tend to be more effective than those generated by filter methods. Furthermore, they are less computationally complex than wrappers, as feature selection is directly included in the construction of a learning model during the training process [60]. However, they are more complex conceptually, and modifications to the learning algorithm may cause poor performance [113].…”
Section: Embedded Approaches
confidence: 99%