Twenty-First International Conference on Machine Learning (ICML '04), 2004
DOI: 10.1145/1015330.1015432

Ensemble selection from libraries of models

Cited by 625 publications (500 citation statements)
References 8 publications
“…With this view in mind, when we are interested in extracting the best features (that is, choose the most informative base classifier values), one way is to do feature selection, where we keep some of the features and discard the rest. Actually, methods where people choose a subset of base classifiers from a large ensemble of candidates [27,34,9,35,11,49,42] do exactly this.…”
Section: Constructing New Uncorrelated Eigenclassifiers
confidence: 99%
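The excerpt above views base-classifier outputs as features and selects an informative subset of them; the cited paper's title, "Ensemble selection from libraries of models", refers to doing this by greedily adding models that improve performance on a held-out hillclimbing set. A minimal sketch of that idea, assuming hypothetical NumPy arrays `val_probs` (per-model predicted probabilities on the held-out set, shape n_models x n_samples) and `val_y` (binary 0/1 labels); names and parameters are illustrative, not the paper's code:

```python
import numpy as np

def greedy_ensemble_selection(val_probs, val_y, n_steps=20):
    """Greedy forward selection with replacement: at each step add the model
    whose inclusion maximizes accuracy of the averaged prediction on the
    held-out hillclimbing set."""
    selected = []                                   # chosen model indices (repeats allowed)
    ensemble_sum = np.zeros_like(val_probs[0], dtype=float)
    for _ in range(n_steps):
        best_model, best_acc = None, -1.0
        for m in range(len(val_probs)):
            avg = (ensemble_sum + val_probs[m]) / (len(selected) + 1)
            acc = np.mean((avg >= 0.5) == val_y)    # 0/1 labels assumed
            if acc > best_acc:
                best_model, best_acc = m, acc
        selected.append(best_model)
        ensemble_sum += val_probs[best_model]
    return selected
```

Allowing a model to be selected more than once effectively weights it in the final average, which is one reason greedy selection with replacement can outperform simply averaging the whole library.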
“…Related work 3.5.1. Selecting a subset of classifiers. Instead of using all classifiers, choosing a subset may lead to higher accuracy and decreased complexity, where the idea is to weed out the inaccurate or redundant (having low diversity) classifiers [54,9,35,42]. If the number of base classifiers is not high, a subset can be found by exhaustive search [36].…”
Section: Comparison With AdaBoost and Bagging
confidence: 99%
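This excerpt notes that when the library is small, the best subset can be found by exhaustive search rather than greedily. A minimal sketch under the same hypothetical `val_probs` / `val_y` arrays as above, enumerating all non-empty subsets (feasible only for small n, since there are 2^n − 1 of them):

```python
from itertools import combinations
import numpy as np

def exhaustive_subset_selection(val_probs, val_y):
    """Try every non-empty subset of models and keep the one whose averaged
    prediction scores highest on the held-out set."""
    val_probs = np.asarray(val_probs)
    n = len(val_probs)
    best_subset, best_acc = None, -1.0
    for k in range(1, n + 1):
        for subset in combinations(range(n), k):
            avg = val_probs[list(subset)].mean(axis=0)
            acc = np.mean((avg >= 0.5) == val_y)
            if acc > best_acc:
                best_subset, best_acc = subset, acc
    return best_subset, best_acc
```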