Multiple Imputation Through XGBoost
Deng and Lumley (2021), preprint
DOI: 10.48550/arxiv.2106.01574

Cited by 4 publications (5 citation statements); references 20 publications.
“…MICE with default settings (van Buuren and Groothuis-Oudshoorn, 2011) would produce unsatisfactory results unless users manually specify any potential non-linear or interaction effects in the imputation model for each incomplete variable. However, researchers often use MICE in an automated way (Deng and Lumley, 2021).…”
Section: MICE
confidence: 99%
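
To make the contrast concrete, here is a minimal R sketch, assuming the formulas interface of the mice package and its built-in nhanes example data; the quadratic term I(chl^2) is purely illustrative of a non-linear effect a user would otherwise have to specify by hand.

library(mice)

# Fully automated default call: linear main-effects imputation models
# for every incomplete variable
imp_default <- mice(nhanes, m = 5, printFlag = FALSE)

# Manually specifying a non-linear effect for one incomplete variable
# via the formulas interface (I(chl^2) is an illustrative term only)
fml <- list(
  bmi = bmi ~ age + hyp + chl + I(chl^2),
  hyp = hyp ~ age + bmi + chl,
  chl = chl ~ age + bmi + hyp
)
imp_custom <- mice(nhanes, formulas = fml, m = 5, printFlag = FALSE)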
“…Nonparametric imputation can avoid selecting a distribution by using machine learning. missForest (Stekhoven and Bühlmann, 2012) uses a random forest and mixgb (Deng and Lumley, 2021) is based on XGBoost for the imputation.…”
Section: Introduction
confidence: 99%
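
A minimal sketch of the two nonparametric imputers in R, assuming the missForest() and mixgb() entry points of their respective packages and reusing mice's nhanes data for illustration.

library(missForest)
library(mixgb)
data(nhanes, package = "mice")  # small example data set with missing values

set.seed(2021)

# Random-forest imputation: missForest returns a single completed data set
rf_out <- missForest(nhanes)
completed_rf <- rf_out$ximp

# XGBoost-based multiple imputation: mixgb returns a list of m completed
# data sets, preserving between-imputation uncertainty
imputed_mixgb <- mixgb(nhanes, m = 5)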
“…This is also true for IRMI [26] and imputeRobust [24] from the R package VIM. Imputation with random forests from the R package ranger [16], and especially imputation with XGBoost using the R package mixgb [9], are outperformed by GAM methods. GAMLSS with a normal distribution (NO) performs better than with an assumed t-distribution (TF).…”
Section: Visual Comparison of a Single Imputation
confidence: 99%
“…Extreme gradient boosting (XGBoost) can also be combined with bootstrapping and predictive mean matching for the imputation of missing data [19]. When used under fully conditional specification (FCS), XGBoost imputation models are developed for each incomplete variable.…”
Section: Extreme Gradient Boosting
confidence: 99%
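
A sketch of what such a call might look like, assuming mixgb's bootstrap, pmm.type, and pmm.k arguments (these argument names can differ across package versions; check ?mixgb).

library(mixgb)
data(nhanes, package = "mice")  # example data with missing values

# FCS-style multiple imputation with XGBoost: one model per incomplete
# variable, cycling maxit times over the data
imputed <- mixgb(
  nhanes,
  m         = 5,    # number of imputed data sets
  maxit     = 1,    # FCS cycles over the incomplete variables
  bootstrap = TRUE, # fit each XGBoost model on a bootstrap sample
  pmm.type  = 2,    # type-2 predictive mean matching
  pmm.k     = 5     # number of donors for matching
)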