2022
DOI: 10.1080/10543406.2022.2089160
Bayesian Additive Regression Trees (BART) with covariate adjusted borrowing in subgroup analyses

Cited by 6 publications (3 citation statements)
References 26 publications
“…Other design methods incorporate information borrowing by pooling all tumor types if deemed homogeneous at the interim analysis (53) or by pooling within subgroups that are defined using a clustering algorithm (54), clustering of tumor types (55–58), modeling patient-level data with a tree ensemble method (59), and using Bayesian commensurate priors (60). Baumann et al.…”
Section: Results
Confidence: 99%
“…BART is an efficient ensemble method for machine learning [78]. Within the past decade its popularity has been increasing, and it has been applied in a diverse range of applications, e.g., [79,80]. It is built from several decision trees whose parameters are regularized in advance [78].…”
Section: Bayesian Additive Regression Trees (BART)
Confidence: 99%
“…More recently, there have been papers on modeling analyses based on the regression tree approach in many fields, such as the piecewise symbolic regression tree by Zhang et al [17], boosted regression trees (Knierim et al [18], Said et al [19], Han et al [20], and Alnahit et al [21]), logistic regression trees by Loh [22], Bayesian additive regression trees (Pan et al [23], Clark et al [24], Um et al [25]), and so on.…”
Section: Introduction
Confidence: 99%
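For readers unfamiliar with the model named in the BART citation statement above (a sum of many weak decision trees whose parameters are regularized in advance), the following sketch illustrates that idea by fitting a BART regression in Python. It assumes the third-party pymc-bart package and purely synthetic data; none of this code comes from the cited papers, and the variable names are illustrative only.

import numpy as np
import pymc as pm
import pymc_bart as pmb

# Synthetic (hypothetical) data: a noisy non-linear relationship.
rng = np.random.default_rng(0)
X = rng.uniform(-2.0, 2.0, size=(200, 1))
y = np.sin(2.0 * X[:, 0]) + rng.normal(0.0, 0.3, size=200)

with pm.Model() as model:
    # The mean function is a sum of m decision trees; the BART prior keeps
    # each individual tree weak, which is the advance regularization the
    # quoted statement refers to.
    mu = pmb.BART("mu", X, y, m=50)
    sigma = pm.HalfNormal("sigma", 1.0)
    pm.Normal("y_obs", mu=mu, sigma=sigma, observed=y)
    # Posterior sampling; pymc-bart supplies the tree-sampling step.
    idata = pm.sample(draws=500, tune=500, random_seed=0)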