2018
DOI: 10.1186/s12918-018-0624-4

An interpretable boosting model to predict side effects of analgesics for osteoarthritis

Abstract: Background: Osteoarthritis (OA) is the most common form of arthritis. Analgesics are widely used in the treatment of arthritis, and may increase the risk of cardiovascular diseases by 20% to 50% overall. There are few studies on the side effects of OA medication, and in particular few risk prediction models for the side effects of analgesics. In addition, most prediction models do not provide clinically useful interpretable rules to explain the reasoning process behind their predictions. In order to assist OA patients, we u…

Cited by 52 publications (48 citation statements)
References 33 publications
“…22,[36][37][38] In addition, although overfitting is a common limitation in refined non-linear machine learning algorithms, XGBoost supervises machine learning problems by parallel computing, regularization, cross validation, flexibility, or availability. 16,39,40 Comparison with previous research DeVries and associates previously reported that no clinically significant differences were observed between the use of unsupervised machine learning with complete admission neurological information and established standards. 10 They showed the inherent weakness of applying AUC to imbalanced data sets and outlined a new strategy to evaluate performance.…”
Section: Fig
confidence: 75%
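The statement above concerns AUC on imbalanced data. As background, a minimal pure-Python sketch of how AUC itself is computed via the Mann-Whitney U statistic (the probability that a randomly chosen positive outranks a randomly chosen negative); the toy data set is invented for illustration only:

```python
def auc(labels, scores):
    """AUC via the Mann-Whitney U statistic: the fraction of
    positive/negative pairs where the positive scores higher
    (ties count as half a win)."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Imbalanced toy set: 2 positives among 10 samples.
labels = [1, 1, 0, 0, 0, 0, 0, 0, 0, 0]
scores = [0.9, 0.4, 0.8, 0.3, 0.2, 0.2, 0.1, 0.1, 0.05, 0.0]
result = auc(labels, scores)  # 15 wins out of 2 * 8 = 16 pairs
```

Note that AUC is computed only over positive/negative pairs, so a trivial all-negative classifier (80% accuracy on this set) gets no credit, which is part of why its behavior on heavily imbalanced data needs care.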
“…[10][11][12][13][14] Among different machine learning systems, extreme gradient boosting (XGBoost) is widely used to accomplish state-of-the-art analyses in diverse fields with good accuracy or area under the receiver operating characteristic curve (AUC). 15,16 XGBoost, a decision-tree-based ensemble machine learning algorithm with a gradient boosting framework, was developed by Chen and Guestrin. 17 It has since been used in traffic census and the field of energy consumption.…”
Section: Introduction
confidence: 99%
“…When adding new models, a gradient descent algorithm was used to minimize the loss. The XGBoost model was widely used for diagnosis classification [34,35], treatment effect [36,37], and prognosis evaluation [38,39] in different diseases.…”
Section: Discussion
confidence: 99%
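The quoted passage describes the core boosting loop: each new model is fit against the gradient of the loss on the current predictions. A minimal pure-Python sketch of that idea with squared loss and one-feature regression stumps (this is the generic gradient boosting scheme, not the actual XGBoost implementation, and the data is invented):

```python
def fit_stump(x, r):
    """Best single-split regression stump: threshold t with
    separate means on each side, minimizing squared error."""
    best = None
    for t in sorted(set(x)):
        left = [ri for xi, ri in zip(x, r) if xi <= t]
        right = [ri for xi, ri in zip(x, r) if xi > t]
        if not left or not right:
            continue
        ml, mr = sum(left) / len(left), sum(right) / len(right)
        sse = (sum((ri - ml) ** 2 for ri in left)
               + sum((ri - mr) ** 2 for ri in right))
        if best is None or sse < best[0]:
            best = (sse, t, ml, mr)
    _, t, ml, mr = best
    return lambda xi: ml if xi <= t else mr

def boost(x, y, rounds=50, lr=0.3):
    """Stage-wise additive model: each stump is fit to the residuals,
    i.e. the negative gradient of squared loss, then added with a
    small learning rate."""
    pred = [0.0] * len(y)
    stumps = []
    for _ in range(rounds):
        residual = [yi - pi for yi, pi in zip(y, pred)]
        s = fit_stump(x, residual)
        stumps.append(s)
        pred = [pi + lr * s(xi) for pi, xi in zip(pred, x)]
    return lambda xi: sum(lr * s(xi) for s in stumps)

x = [1, 2, 3, 4, 5, 6]
y = [1.0, 1.0, 1.0, 5.0, 5.0, 5.0]
model = boost(x, y)
```

XGBoost adds to this scheme second-order gradient information, explicit regularization terms, and parallelized tree construction, which is what the quoted citations credit for its robustness to overfitting.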
“…Using the R package Caret ( CRAN: Caret ) 59 , we developed three (3) models with inherent five (5)-fold cross validation (CV), such that we obtained test set accuracy on running each model. These models included Random Forest 60 , Extreme Gradient Boosting 61 , and Gradient Boosting Machine (GBM) 62 . Testing these models with five-pronged, FDA-adherent teratogenicity scores, we found that GBM yielded the highest predictive accuracy.…”
Section: Methods
confidence: 99%
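The workflow above relies on 5-fold cross-validation to obtain test-set accuracy for each model. A minimal pure-Python sketch of that procedure (not caret's implementation; the one-feature threshold classifier and the data are hypothetical stand-ins for the models being compared):

```python
import random

def k_fold_indices(n, k=5, seed=0):
    """Shuffle indices 0..n-1 and split them into k near-equal folds."""
    idx = list(range(n))
    random.Random(seed).shuffle(idx)
    return [idx[i::k] for i in range(k)]

def cross_validate(xs, ys, fit, k=5):
    """Per-fold accuracy: train on k-1 folds, score on the held-out fold."""
    folds = k_fold_indices(len(xs), k)
    accs = []
    for held in folds:
        train = [j for fold in folds if fold is not held for j in fold]
        model = fit([xs[j] for j in train], [ys[j] for j in train])
        correct = sum(model(xs[j]) == ys[j] for j in held)
        accs.append(correct / len(held))
    return accs

def fit_threshold(x, y):
    """Hypothetical toy model: classify by comparing to the training mean."""
    t = sum(x) / len(x)
    return lambda xi: 1 if xi > t else 0

xs = [0.1, 0.2, 0.3, 0.4, 0.6, 0.7, 0.8, 0.9, 0.15, 0.85]
ys = [0, 0, 0, 0, 1, 1, 1, 1, 0, 1]
accs = cross_validate(xs, ys, fit_threshold, k=5)
```

Averaging `accs` gives the cross-validated accuracy used to rank candidate models, which is how the quoted study selected GBM over the other two learners.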