2007
DOI: 10.2139/ssrn.1089358

Trimmed Bagging

Cited by 7 publications (12 citation statements)
References 10 publications
“…They carried out experiments on benchmark datasets and conclude from the results that robust bagging performs quite similarly to standard bagging when applied to unstable base classifiers such as decision trees, but performs better when applied to more stable base classifiers such as Fisher Linear Discriminant Analysis and the Nearest Mean Classifier. Similarly, in the experiments of Croux et al (2007), trimmed bagging performs comparably to standard bagging when applied to decision trees, but yields better results when applied to more stable base classifiers, like support vector machines. […] presented a study of different selective bagging techniques using a decision stump as the base classifier.…”
Section: Selective Bagging Variants
confidence: 91%
“…The AP procedure adopted a reduction rule similar to that of nice bagging (Nice) [25] and trimmed bagging (TB) [26], in which only the good or "nice" bootstrap versions of the base models, validated on out-of-bag samples, were aggregated. Specifically, those base models generated in traditional bagging that performed better than the rest, according to a certain decile value, were retained to comprise the final reduced set of base models.…”
Section: Accuracy-based Pruning Methods (AP)
confidence: 99%
“…These two methods can be combined in either order, which yields the two-stage strategy. The former, i.e., the AP procedure, used a rule similar to that of nice bagging [25] and trimmed bagging [26]: it excluded the worst classifiers and aggregated the rest. Specifically, of all the models built in traditional bagging, the base models with the highest prediction accuracy (or, equivalently, the lowest error rates) on their out-of-bag samples were selected and retained.…”
Section: Introduction
confidence: 99%
“…Croux et al [37] propose the idea of trimmed bagging, which aims to prune the classifiers that yield the highest error rates, as estimated by the out-of-bag error rate. It has been shown that trimmed bagging performs comparably to standard bagging when applied to unstable classifiers such as decision trees, but yields improved accuracy when applied to more stable base classifiers, like support vector machines.…”
Section: Post-combining Pruning
confidence: 99%