Machine Learning Proceedings 1994
DOI: 10.1016/b978-1-55860-335-6.50023-4
Irrelevant Features and the Subset Selection Problem

Cited by 1,673 publications (1,048 citation statements)
References 21 publications
“…In order to bring them to the same order of magnitude, all variables were statistically standardized. To select the variables with the greatest power to discriminate between the soil classes, the Sequential Forward Selection (SFS) method (JOHN et al, 1994) was used in the WEKA 3.5.8 software (WITTEN & FRANK, 2005). …”
unclassified
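The excerpt above applies sequential forward selection: a greedy wrapper-style search that repeatedly adds the single feature giving the largest gain in estimated classifier accuracy. Below is a minimal Python sketch of that idea; the dataset, the k-NN base learner, and 5-fold cross-validation scoring are illustrative assumptions, not details from the cited study (which ran SFS in WEKA 3.5.8).

```python
# Minimal sequential forward selection (SFS) sketch.
# Assumptions: scikit-learn available, a generic (X, y) dataset,
# k-NN as the base learner, 5-fold CV accuracy as the score.
from sklearn.datasets import load_wine
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.preprocessing import StandardScaler

X, y = load_wine(return_X_y=True)
X = StandardScaler().fit_transform(X)   # standardize variables, as in the excerpt

def sfs(X, y, estimator, k):
    """Greedily add the feature that most improves cross-validated accuracy."""
    selected, remaining = [], list(range(X.shape[1]))
    while len(selected) < k:
        scores = {
            f: cross_val_score(estimator, X[:, selected + [f]], y, cv=5).mean()
            for f in remaining
        }
        best = max(scores, key=scores.get)
        selected.append(best)
        remaining.remove(best)
    return selected

print(sfs(X, y, KNeighborsClassifier(), k=5))
```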
“…In general, selected features induce better decision trees than original features. For CorrAl (John, Kohavi and Pfleger 1994), Bupa, and Abalone, both tree size and error rate are reduced after feature selection. Without feature selection the decision tree for CorrAl picks the correlated feature C as the root.…”
Section: Methods
mentioning
confidence: 99%
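For context, the CorrAl data set from John, Kohavi and Pfleger (1994) has four relevant Boolean features A0, A1, B0, B1 with target (A0 AND A1) OR (B0 AND B1), one irrelevant random feature I, and one feature C that agrees with the class roughly 75% of the time. The sketch below is a reconstruction under those assumptions, not code from the cited work; it generates such data and fits a decision tree to illustrate how the correlated feature tends to be chosen at the root.

```python
# CorrAl-style data sketch: target = (A0 and A1) or (B0 and B1), plus an
# irrelevant feature I and a feature C agreeing with the class ~75% of the
# time. Sample size and use of scikit-learn are illustrative assumptions.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
n = 512
A0, A1, B0, B1, I = (rng.integers(0, 2, n) for _ in range(5))
y = (A0 & A1) | (B0 & B1)
flip = rng.random(n) < 0.25            # C disagrees with the class 25% of the time
C = np.where(flip, 1 - y, y)

X = np.column_stack([A0, A1, B0, B1, I, C])
names = ["A0", "A1", "B0", "B1", "I", "C"]

tree = DecisionTreeClassifier(criterion="entropy", random_state=0).fit(X, y)
print("root split on:", names[tree.tree_.feature[0]])   # often the correlated feature C
```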
“…The filter schemes are independent of the induction algorithm. In scheme (3), the relationship is taken the other way around: it is the FSA that uses the learning algorithm as a subroutine (John et al, 1994). It employs a search through the space of feature subsets using the estimated accuracy from an induction algorithm as the measure of goodness for a particular feature subset.…”
Section: Feature Selection
mentioning
confidence: 99%
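The excerpt above summarizes the wrapper approach: the feature selection algorithm searches the space of feature subsets and scores each candidate by the estimated accuracy of the induction algorithm itself. A hedged Python sketch of an exhaustive wrapper over a small feature set follows; the exhaustive search, decision-tree learner, and cross-validation scoring are illustrative choices rather than the specific search strategy of the cited paper.

```python
# Wrapper-style subset selection sketch: score every feature subset by the
# induction algorithm's estimated (cross-validated) accuracy and keep the best.
# Exhaustive search is only feasible for a handful of features; the learner
# and scoring choices here are assumptions for illustration.
from itertools import combinations

from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)
n_features = X.shape[1]

best_subset, best_score = None, -1.0
for r in range(1, n_features + 1):
    for subset in combinations(range(n_features), r):
        cols = list(subset)
        score = cross_val_score(DecisionTreeClassifier(random_state=0),
                                X[:, cols], y, cv=5).mean()
        if score > best_score:
            best_subset, best_score = subset, score

print(f"best subset {best_subset} with CV accuracy {best_score:.3f}")
```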