2010
DOI: 10.1007/s10994-010-5224-5
Feature-subspace aggregating: ensembles for stable and unstable learners

Abstract: This paper introduces a new ensemble approach, Feature-Subspace Aggregating (Feating), which builds local models instead of global models. Feating is a generic ensemble approach that can enhance the predictive performance of both stable and unstable learners. In contrast, most existing ensemble approaches can improve the predictive performance of unstable learners only. Our analysis shows that the new approach reduces the execution time to generate a model in an ensemble through an increased level of localisat…
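The abstract's core idea, building local models in feature subspaces and aggregating their predictions, can be illustrated with a minimal sketch. This is not the paper's algorithm: the function names are invented, and the "local model" here is just the majority class of each partition, a stand-in for whatever base learner (stable or unstable) Feating would actually train per subspace.

```python
from collections import Counter, defaultdict
from itertools import combinations

def feating_fit(X, y, n=1):
    """Sketch of a Feating-style ensemble over discrete attributes:
    for every combination of n attributes, partition the training data
    by that combination's values and fit one local model per partition
    (here simply the majority class of the partition)."""
    d = len(X[0])
    ensemble = {}
    for attrs in combinations(range(d), n):
        local = defaultdict(list)
        for xi, yi in zip(X, y):
            key = tuple(xi[a] for a in attrs)   # the "location" in this subspace
            local[key].append(yi)
        ensemble[attrs] = {k: Counter(v).most_common(1)[0][0]
                           for k, v in local.items()}
    return ensemble

def feating_predict(ensemble, x):
    """Aggregate by majority vote over all subspace-local models."""
    votes = []
    for attrs, models in ensemble.items():
        key = tuple(x[a] for a in attrs)
        if key in models:                       # skip subspaces never seen in training
            votes.append(models[key])
    return Counter(votes).most_common(1)[0][0]

# toy discrete data: the class happens to equal the first attribute
X = [(0, 0), (0, 1), (1, 0), (1, 1)]
y = ['a', 'a', 'b', 'b']
model = feating_fit(X, y, n=1)
print(feating_predict(model, (0, 1)))
```

With n=1 this degenerates to one local model per single attribute value; the paper's point is that enumerating subspaces of n attributes yields many small local models that are cheap to train individually.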


Cited by 40 publications (23 citation statements)
References 23 publications
“…Feating (Ting et al 2011) is a generic ensemble learning technique that also builds upon the ensembling strategy of AODE. Like AnDE, Feating operates by building a local model for each combination of n attribute values.…”
Section: Relationship To Feating (mentioning)
confidence: 99%
“…In Sect. 3 we discuss how the AnDE algorithms relate to Feating (Ting et al 2011), a generic approach to ensembling that also builds upon techniques pioneered by AODE. In Sect.…”
Section: Introduction (mentioning)
confidence: 99%
“…Recently, a new ensemble technique, called feature-subspace aggregating (Feating) [24], was proposed that was shown to have nice performances. The key point of these ensemble methods is aggregating a large number of models built using sub-datasets randomly generated using for example bootstrapping.…”
Section: Model Population Analysis and Ensemble Learning (mentioning)
confidence: 99%
“…Near the end of the project we read a paper entitled Feature-subspace aggregating: ensembles for stable and unstable learners, [34]. Its core algorithm turned out to be inapplicable for sparse sequence data, but we were intrigued with the basic idea of trying to build CRFs that are expert in various "locations" in feature space.…”
Section: A15 Feature-class/label Transition Pools (mentioning)
confidence: 99%