2014
DOI: 10.3414/me13-01-0123

Extending Statistical Boosting

Abstract: Background: Boosting algorithms to simultaneously estimate and select predictor effects in statistical models have gained substantial interest during the last decade. Objectives: This review highlights recent methodological developments regarding boosting algorithms for statistical modelling, especially focusing on topics relevant for biomedical research. Methods: We suggest a unified framework for gradient boosting and likelihood-based boosting (statistical boosting), which have been addressed separately…

Cited by 40 publications (10 citation statements)
References 85 publications (108 reference statements)
“…The ensemble superscript (j) might be left out in the following text for brevity, especially when talking about a single Gentle Boost ensemble whose position within the multi-model is unimportant. Other methods which could be incorporated in our collaborative strategy include gradient boosting and likelihood-based boosting (statistical boosting) algorithms [37], as well as algorithms that optimize non-convex potential functions instead of the traditional exponential loss function [44, 45].…”
Section: Methods
mentioning
confidence: 99%
“…Boosting algorithms can be modified such that they contain an intrinsic mechanism for variable selection and model choice (component-wise learning [35]). Recently, Mayr et al. [37] provide comprehensive overviews on the evolution of boosting algorithms, as well as on extending statistical boosting.…”
mentioning
confidence: 99%
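The component-wise learning referred to in this statement achieves intrinsic variable selection by fitting one base-learner per candidate predictor in every iteration and updating only the best-fitting one. Below is a minimal, self-contained Python sketch of component-wise L2 boosting that illustrates the mechanism; the function name, step length and number of iterations are illustrative assumptions, not the implementation discussed by Mayr et al. [37].

import numpy as np

def componentwise_l2_boost(X, y, n_steps=250, nu=0.1):
    """Minimal component-wise L2 boosting: in each step, fit every
    univariate least-squares base-learner to the current residuals and
    update only the single best-fitting component (step length nu).
    Predictors that are never picked keep a coefficient of exactly zero,
    which is the intrinsic variable selection the statement refers to."""
    n, p = X.shape
    coef = np.zeros(p)
    intercept = y.mean()              # offset: fit the intercept first
    resid = y - intercept
    for _ in range(n_steps):
        # univariate least-squares slopes of the residuals on each column
        # (assumes no column is constant)
        betas = X.T @ resid / np.einsum("ij,ij->j", X, X)
        rss = ((resid[:, None] - X * betas) ** 2).sum(axis=0)
        j = rss.argmin()              # best-fitting component this step
        coef[j] += nu * betas[j]      # weak update of that component only
        resid -= nu * betas[j] * X[:, j]
    return intercept, coef

# toy example: only 3 of 20 predictors are informative
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 20))
y = 2 * X[:, 0] - 1.5 * X[:, 3] + 0.5 * X[:, 7] + rng.standard_normal(200)
intercept, coef = componentwise_l2_boost(X, y)
print("selected:", np.flatnonzero(coef))   # predictors picked at least once

Because only one coefficient is updated per step, stopping the algorithm early directly controls how many variables enter the model, which is why early stopping doubles as model choice in this class of algorithms.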
“…In doing so, they update and expand earlier reviews of this specific area of statistical learning research [46]. For the first time, recent methodologic research on boosting functional data and on the application of boosting techniques in advanced survival modelling is reviewed.…”
mentioning
confidence: 74%
“…Likelihood-based boosting [3, 29] is the other general approach in the framework of statistical boosting algorithms; it received much attention particularly in the context of high-dimensional biomedical data (see [11] and the references therein). Although it follows a very similar structure to gradient boosting (see Box 1), both approaches only coincide in special cases such as classical Gaussian regression via the L2 loss [1, 30].…”
Section: Statistical Boosting
mentioning
confidence: 99%
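The special case noted here, that gradient boosting and likelihood-based boosting coincide for classical Gaussian regression with the L2 loss, follows from a short standard argument. The LaTeX sketch below spells it out; the notation \hat{f}^{[m-1]} for the fit after m-1 steps is introduced purely for illustration.

% For the L2 loss, the negative gradient at the current fit is the residual:
\[
  \rho(y, f) = \tfrac{1}{2}\,(y - f)^2
  \quad\Longrightarrow\quad
  u^{[m]} = -\left.\frac{\partial \rho(y, f)}{\partial f}\right|_{f = \hat{f}^{[m-1]}}
          = y - \hat{f}^{[m-1]} .
\]
% Gradient boosting therefore fits the base-learner to the current residuals.
% For a Gaussian likelihood with identity link, a likelihood-based boosting
% step maximises the log-likelihood of a candidate base-learner with the
% current fit \hat{f}^{[m-1]} entering as a fixed offset, which is again a
% least-squares fit of exactly these residuals; hence both algorithms
% perform the same update in this special case.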
“…An accompanying article [11] highlighted the multiple extensions of the basic algorithms towards (i) enhanced variable selection properties, (ii) new types of predictor effects, and (iii) new regression settings. Substantial methodological developments on statistical boosting algorithms throughout the last few years (e.g., stability selection [12]) and a growing community have opened the door to new model classes and frameworks (e.g., joint models [13] and functional data [14]), asking for an up-to-date review on the available extensions.…”
Section: Introduction
mentioning
confidence: 99%
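Stability selection, cited in this statement, wraps the boosting fit in repeated subsampling and keeps only the predictors selected in a large fraction of the subsamples. The Python sketch below shows this generic recipe with a tiny component-wise L2 boosting fit as the selector; the half-sample scheme, the threshold of 0.6 and the other defaults are illustrative assumptions rather than the exact procedure of reference [12].

import numpy as np

def stability_selection(X, y, n_subsamples=100, threshold=0.6,
                        mstop=100, nu=0.1, seed=None):
    """Sketch of stability selection on top of component-wise L2 boosting:
    refit the model on many random half-samples, record which predictors
    receive a non-zero coefficient, and keep those whose selection
    frequency reaches the threshold (names/defaults are illustrative)."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    hits = np.zeros(p)
    for _ in range(n_subsamples):
        idx = rng.choice(n, size=n // 2, replace=False)   # random half-sample
        Xs, ys = X[idx], y[idx]
        # tiny component-wise L2 boosting fit on the subsample
        coef, resid = np.zeros(p), ys - ys.mean()
        for _ in range(mstop):
            betas = Xs.T @ resid / np.einsum("ij,ij->j", Xs, Xs)
            rss = ((resid[:, None] - Xs * betas) ** 2).sum(axis=0)
            j = rss.argmin()
            coef[j] += nu * betas[j]
            resid -= nu * betas[j] * Xs[:, j]
        hits += coef != 0                   # which predictors were selected
    freq = hits / n_subsamples              # per-predictor selection frequency
    return np.flatnonzero(freq >= threshold), freq

Selection frequencies across many subsamples, rather than a single fit, are what make the final variable set less sensitive to the choice of the stopping iteration.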