1998
DOI: 10.1002/(sici)1097-4571(19980415)49:5<423::aid-asi5>3.3.co;2-s
Feature selection and effective classifiers

Abstract: In this article, we develop and analyze four algorithms for feature selection in the context of rough set methodology. The initial state and the feasibility criterion of all these algorithms are the same. That is, they start with a given feature set and progressively remove features…
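The abstract describes algorithms that start from the full feature set and progressively remove features under a rough-set feasibility criterion. A minimal sketch of that backward-elimination pattern is shown below; the dependency measure (size of the rough-set positive region) is a standard criterion assumed here for illustration, not necessarily the exact one used in the paper, and all function names are hypothetical.

```python
def partition(rows, features):
    """Group row indices into blocks that agree on the given features."""
    blocks = {}
    for i, row in enumerate(rows):
        key = tuple(row[f] for f in features)
        blocks.setdefault(key, []).append(i)
    return blocks.values()

def dependency(rows, labels, features):
    """Fraction of rows whose block is label-pure: the relative size
    of the rough-set positive region of the decision attribute."""
    if not features:
        return 0.0
    consistent = 0
    for block in partition(rows, features):
        if len({labels[i] for i in block}) == 1:
            consistent += len(block)
    return consistent / len(rows)

def backward_select(rows, labels, features):
    """Start with all features; greedily drop any feature whose removal
    preserves the dependency of the full set (a reduct-style criterion)."""
    target = dependency(rows, labels, features)
    selected = list(features)
    for f in list(selected):
        trial = [g for g in selected if g != f]
        if trial and dependency(rows, labels, trial) >= target:
            selected = trial
    return selected
```

For example, on a table where feature "c" duplicates the information in "a", the procedure drops the redundant features and keeps a single one that still discerns the decision classes.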

Cited by 24 publications (2 citation statements)
References 14 publications
“…This feature space consists of descriptive features used for the classification step in a complete system for flaw detection. Unlike the cost matrix theory, 13 or the methods of Celeux and Lechevallier 14 or Deogun et al, 15 this description space enables the selection of a set of appropriate descriptive features without taking into account any classifier.…”
Section: Description of the Methods
confidence: 99%
“…Previous methods employed an incremental hill-climbing (greedy) algorithm to select features (Hu, 1995; Deogun et al., 1998). However, this often led to a non-minimal feature combination.…”
Section: Introduction
confidence: 99%