2019
DOI: 10.1101/856385
Preprint

Hierarchical Resampling for Bagging in Multi-Study Prediction with Applications to Human Neurochemical Sensing

Abstract: Prediction settings with multiple studies have become increasingly common. Ensembling models trained on individual studies has been shown to improve replicability in new studies. Motivated by a groundbreaking new technology in human neuroscience, we introduce two generalizations of multi-study ensemble predictions. First, while existing methods weight ensemble elements by cross-study prediction performance, we extend weighting schemes to also incorporate covariate similarity between training data and target va…

Cited by 3 publications (2 citation statements)
References 44 publications

“…10,13 The domain generalization literature focuses on leveraging multiple datasets in model training to improve model generalizability and enhance prediction performance on a new, unknown, but related, "domain". 14 Our methods are inspired both by this vast literature in transfer learning as well as related methods in "multistudy" statistics that draw upon multiple data sources in inference, 15,16 supervised prediction, 7,17,18 and unsupervised learning. 19 Previous work in this area proposed multistudy ensembling in conjunction with a generalization of stacking, an ensemble weight estimation method, 20 as a flexible strategy to aggregate information from different studies.…”
Section: Related Literature
confidence: 99%
“…19 Previous work in this area proposed multistudy ensembling in conjunction with a generalization of stacking, an ensemble weight estimation method, 20 as a flexible strategy to aggregate information from different studies. 6,7,17,18 The approach involves two separate stages: (A) training one or more models on each study separately, and (B) constructing an ensemble prediction rule that is a weighted average of the predictions from each of the study-specific models. The ensemble weights are estimated in step B through "multistudy stacking" (MSS), by regressing the outcome of all training studies against the predictions of each of these models, using all available training data sets.…”
Section: Related Literature
confidence: 99%
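
The two-stage procedure described in the statement above (study-specific training followed by stacked weight estimation) is concrete enough to sketch in code. Below is a minimal illustration in Python, assuming scikit-learn-style learners; the random-forest base learner, the non-negative least-squares weight fit, and the function names fit_multistudy_stack and predict_stack are assumptions made for the example, not the cited papers' actual implementation.

```python
# Illustrative two-stage multistudy stacking (MSS) sketch.
# Assumptions: scikit-learn-style regressors; non-negative least squares
# as one common choice for the stacking weights. All names are hypothetical.
import numpy as np
from scipy.optimize import nnls
from sklearn.ensemble import RandomForestRegressor


def fit_multistudy_stack(studies):
    """studies: list of (X, y) arrays, one pair per training study."""
    # Stage A: train one model on each study separately.
    models = [RandomForestRegressor(random_state=0).fit(X, y)
              for X, y in studies]

    # Stage B: regress the pooled outcomes of all training studies on the
    # predictions of every study-specific model, using all available
    # training data, to estimate the ensemble weights.
    X_all = np.vstack([X for X, _ in studies])
    y_all = np.concatenate([y for _, y in studies])
    pred_matrix = np.column_stack([m.predict(X_all) for m in models])
    weights, _ = nnls(pred_matrix, y_all)
    return models, weights


def predict_stack(models, weights, X_new):
    """Ensemble rule: weighted average of study-specific predictions."""
    pred_matrix = np.column_stack([m.predict(X_new) for m in models])
    return pred_matrix @ weights
```

Given data from a new study, predict_stack returns the weighted combination of each study-specific model's predictions, with the weights estimated in stage B.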