2021
DOI: 10.1007/978-3-030-92270-2_7

Gradient Boosting Forest: a Two-Stage Ensemble Method Enabling Federated Learning of GBDTs

Cited by 4 publications (1 citation statement)
References 16 publications
“…However, it is important to note that most of this computation can be performed in parallel and thus only leads to a small overhead of computing the ensemble approximation and recalibration in real time. Some works have proposed methods for speeding up the training of ensembles, such as snapshot ensembles [35,36], which could also be applied in this case. Another widely accepted advantage of ensembles is that they often improve prediction accuracy (see Fig.…”
Section: Discussion
Confidence: 99%
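The citing passage mentions snapshot ensembles as a cheap way to obtain an ensemble from a single training run: the learning rate is cycled so the model repeatedly converges, and a snapshot is saved at each cycle's end. The sketch below is a minimal illustration of that idea in PyTorch, not the method of references 35 and 36, which are not resolved in this excerpt; all function names and hyperparameters here are illustrative.

```python
import copy
import math
import torch


def train_snapshot_ensemble(model, loader, loss_fn,
                            n_cycles=5, epochs_per_cycle=10, lr_max=0.1):
    """Sketch of snapshot-ensemble training: one SGD run with a cyclic
    cosine-annealed learning rate, saving a model copy per cycle."""
    optimizer = torch.optim.SGD(model.parameters(), lr=lr_max)
    steps_per_cycle = epochs_per_cycle * len(loader)
    snapshots, step = [], 0
    for _ in range(n_cycles):
        for _ in range(epochs_per_cycle):
            for x, y in loader:
                # Anneal the LR from lr_max down to ~0 within each cycle,
                # then restart, so every cycle ends near a local minimum.
                t = (step % steps_per_cycle) / steps_per_cycle
                for group in optimizer.param_groups:
                    group["lr"] = 0.5 * lr_max * (1 + math.cos(math.pi * t))
                optimizer.zero_grad()
                loss_fn(model(x), y).backward()
                optimizer.step()
                step += 1
        # Save a frozen copy of the model at the end of the cycle.
        snapshots.append(copy.deepcopy(model).eval())
    return snapshots


def ensemble_predict(snapshots, x):
    """Average snapshot predictions; each forward pass is independent,
    so this step parallelizes in the way the quoted discussion notes."""
    with torch.no_grad():
        return torch.stack([m(x) for m in snapshots]).mean(dim=0)
```

Because the member models' forward passes are independent, the prediction-averaging step is embarrassingly parallel, which is the basis for the quoted claim that the ensemble adds only a small real-time overhead.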