2019
DOI: 10.1145/3365224

Hierarchical Ensemble Reduction and Learning for Resource-constrained Computing

Abstract: Generic tree ensembles (such as Random Forest, RF) rely on a substantial number of individual models to attain desirable performance. The cost of maintaining a large ensemble could become prohibitive in applications where computing resources are stringent. In this work, a hierarchical ensemble reduction and learning framework is proposed. Experiments show our method consistently outperforms RF in terms of both accuracy and retained ensemble size. In other words, ensemble reduction is achieved with enhancement …

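The full text is not reproduced here, so as a rough illustration of what "ensemble reduction" means in practice, the sketch below prunes a standard scikit-learn Random Forest by greedily keeping only the trees that improve majority-vote accuracy on a held-out split. This is a generic baseline written under our own assumptions, not the hierarchical framework the abstract describes; the dataset, forest size, and helper names (tree_preds, selected) are purely illustrative.

# Hedged sketch: generic greedy ensemble pruning, NOT the paper's hierarchical
# reduction framework. Idea: a smaller retained ensemble means less memory and
# fewer model evaluations at inference time on a resource-constrained device.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Illustrative data and a full 100-tree forest (all values are assumptions).
X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.3, random_state=0)
forest = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)

# Cache each tree's predictions on the validation split: shape (n_trees, n_val).
tree_preds = np.array([t.predict(X_val) for t in forest.estimators_])

selected, best_acc = [], 0.0
for _ in range(len(forest.estimators_)):
    # Accuracy of the current subset plus each remaining candidate tree.
    gains = []
    for i in range(len(forest.estimators_)):
        if i in selected:
            gains.append(-1.0)
            continue
        votes = tree_preds[selected + [i]].mean(axis=0) >= 0.5  # majority vote
        gains.append(float((votes == y_val).mean()))
    i_best = int(np.argmax(gains))
    if gains[i_best] <= best_acc:  # stop once no tree improves validation accuracy
        break
    best_acc = gains[i_best]
    selected.append(i_best)

print(f"kept {len(selected)} of {len(forest.estimators_)} trees, "
      f"validation accuracy {best_acc:.3f}")

In this toy setting the retained subset is typically far smaller than the full forest at comparable validation accuracy, which is the kind of accuracy-versus-ensemble-size trade-off the abstract refers to.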
Cited by 2 publications (2 citation statements)
References 24 publications

Citation statements (ordered by relevance):
“…The translator proposed is not a classical one-pass or two-pass translator [50]. It is a translator adapted to the generation of the NoC communication subsystem.…”
Section: A Generation of a Low-Level Model of the NoC Communication S… (citation type: mentioning; confidence: 99%)
“…This is consistent with Mitchell et al. (2002), who used EnKF with an ensemble size of 64 to assimilate radiosonde, satellite, and aircraft data into a dry, global, primitive-equation model. Over the past decade, the boost in computational power has opened the door to new approaches for improving simulation performance, including bagging ensemble filters, entropy ensemble filters, and hierarchical ensemble filters (Foroozand and Weijs, 2017; Wang et al., 2020).…”
Section: Sensitive Parameters of the EnKF Assimilation (citation type: mentioning; confidence: 99%)