2016
DOI: 10.18637/jss.v074.i01

gamboostLSS: An R Package for Model Building and Variable Selection in the GAMLSS Framework

Abstract: Generalized additive models for location, scale and shape are a flexible class of regression models that allow multiple parameters of a distribution function, such as the mean and the standard deviation, to be modeled simultaneously. With the R package gamboostLSS, we provide a boosting method to fit these models. Variable selection and model choice are naturally available within this regularized regression framework. To introduce and illustrate the R package gamboostLSS and its infrastructure, we use a data set on…

Cited by 69 publications (81 citation statements)
References 37 publications
“…We used the selected sample to estimate the model and the balance of the data in each sample to determine the out‐of‐bag prediction accuracy (empirical risk), measured by the negative log‐likelihood of each model; the optimal stopping iteration (m̂_stop) is the iteration with the lowest average empirical risk. In boosted GAMLSS models we used multi‐dimensional subsampling to determine the stopping iteration for each of the GAMLSS parameters, while allowing for potentially different model complexities in the parameters; a detailed explanation of this cross‐validation (subsampling) scheme is given in Hofner, Mayr, et al (2016).…”
Section: Methods (mentioning)
confidence: 99%
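To make the multi-dimensional subsampling scheme described in this statement concrete, the following is a minimal R sketch with gamboostLSS on simulated data. The data, the number of folds (B = 25), and the grid bounds are illustrative assumptions and are not taken from the cited studies; the calls themselves (glmboostLSS, cv, make.grid, cvrisk, mstop) follow the gamboostLSS/mboost interface.

library("gamboostLSS")   # loads mboost as well

## Simulated data (illustrative): the mean depends on x1, the standard deviation on x2.
set.seed(1234)
n   <- 500
dat <- data.frame(x1 = runif(n), x2 = runif(n))
dat$y <- rnorm(n, mean = 1 + 2 * dat$x1, sd = exp(0.5 + dat$x2))

## Boosted Gaussian GAMLSS: one submodel for the location (mu), one for the scale (sigma).
mod <- glmboostLSS(y ~ x1 + x2, data = dat, families = GaussianLSS(),
                   control = boost_control(mstop = 100))

## 25 subsampling folds; the out-of-bag empirical risk is the negative log-likelihood.
folds <- cv(model.weights(mod), type = "subsampling", B = 25)

## Grid of (mstop_mu, mstop_sigma) combinations, allowing different complexities per parameter.
grid <- make.grid(max = c(mu = 500, sigma = 500), min = 10,
                  length.out = 10, dense_mu_grid = TRUE)

## Average out-of-bag risk for every combination on the grid.
cvr <- cvrisk(mod, folds = folds, grid = grid)

## Combination with the lowest average empirical risk ...
mstop(cvr)
## ... and stop the model at exactly these iterations.
mstop(mod) <- mstop(cvr)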
“…In boosted GAMLSS models we used multi-dimensional subsampling to determine the stopping iteration for each of the GAMLSS parameters while allowing for potentially different model complexities in the parameters; a detailed explanation of this cross-validation (subsampling) scheme is given in Hofner, Mayr, et al (2016).…”
Section: Modeling Approach (mentioning)
confidence: 99%
“…The range of the grid (minimum and maximum m_stop) has to be specified ad hoc; however, it might be necessary to adapt it based on the results (adaptive grid search). For a more detailed discussion of how to select this grid in practice, we refer to Hofner et al (2016). Finally, the combination of stopping iterations from the grid that yields the smallest empirical risk on the test data (e.g., via cross-validation or resampling procedures) is selected.…”
Section: Boosting Joint Models (mentioning)
confidence: 99%
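A hedged sketch of this grid selection step, reusing the model mod and the subsampling folds from the sketch above; the grid bounds below are arbitrary assumptions chosen only for illustration.

## The grid range (min / max per distribution parameter) has to be fixed ad hoc ...
grid <- make.grid(max = c(mu = 1000, sigma = 400), min = 20,
                  length.out = 15, dense_mu_grid = TRUE)
cvr  <- cvrisk(mod, folds = folds, grid = grid)

## ... the combination with the smallest average out-of-bag risk is selected; if it lies
## on the upper boundary of the grid, enlarge `max` and rerun (adaptive grid search).
mstop(cvr)
plot(cvr)   # heat map of the average empirical risk over the grid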