2019
DOI: 10.35940/ijeat.f8684.088619
Accurate Liver Disease Prediction with Extreme Gradient Boosting

Abstract: Machine learning is used extensively in medical diagnosis to predict the existence of diseases. Existing classification algorithms are frequently used for automatic detection of diseases, but they rarely give 100% accurate results. Boosting techniques are often used in machine learning to maximize classification accuracy. Although several boosting techniques exist, the XGBoost algorithm performs extremely well on some selected data sets. Building an XGBoost model is simple but im…

Cited by 8 publications (2 citation statements)
References 14 publications
“…The accuracies of these modelling techniques were found to be above 85%, and the majority were 90% and above. One of the papers, by Murty and Kumar (2019), showed that optimizing the L2 regularization, the logistic loss function, the learning rate and the number of estimators used in model development gave them an accuracy of 99%, the highest recorded in previous studies. Budoliya, Shrivastava and Sharma (2020) used Bayesian optimization on their model and achieved 85% accuracy on the training dataset and 91.8% accuracy on the testing dataset.…”
Section: Extreme Gradient Boosting (XGBoost)
Confidence: 93%
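The hyperparameters named in the statement above (L2 regularization, logistic loss, learning rate, number of estimators) are exactly the quantities that appear in XGBoost's leaf-weight formula w = -G/(H + λ). A minimal, pure-Python sketch of an XGBoost-style booster of depth-1 trees illustrates how they interact; the function names and toy data here are our own illustration, not taken from the cited paper, and a real model would use the `xgboost` package.

```python
import math

def xgb_stump_fit(X, y, n_estimators=20, eta=0.3, lam=1.0):
    """XGBoost-style boosting of decision stumps with logistic loss.

    eta  -- learning rate (shrinkage on each stump's contribution)
    lam  -- L2 penalty on leaf weights: w = -G / (H + lam)
    """
    F = [0.0] * len(y)                      # raw scores (log-odds), start at 0
    stumps = []
    for _ in range(n_estimators):
        # First/second derivatives of logistic loss at current scores.
        p = [1.0 / (1.0 + math.exp(-f)) for f in F]
        g = [pi - yi for pi, yi in zip(p, y)]
        h = [pi * (1.0 - pi) for pi in p]
        best = None
        for j in range(len(X[0])):          # exhaustive split search
            for t in sorted({row[j] for row in X}):
                L = [i for i in range(len(y)) if X[i][j] < t]
                R = [i for i in range(len(y)) if X[i][j] >= t]
                if not L or not R:
                    continue
                def leaf(idx):
                    # Optimal leaf weight and its gain under L2 penalty lam.
                    G = sum(g[i] for i in idx)
                    H = sum(h[i] for i in idx)
                    return -G / (H + lam), G * G / (H + lam)
                (wl, gl), (wr, gr) = leaf(L), leaf(R)
                if best is None or gl + gr > best[0]:
                    best = (gl + gr, j, t, wl, wr)
        if best is None:                    # no usable split left
            break
        _, j, t, wl, wr = best
        stumps.append((j, t, eta * wl, eta * wr))
        F = [f + (eta * wl if X[i][j] < t else eta * wr)
             for i, f in enumerate(F)]
    return stumps

def xgb_stump_predict(stumps, x):
    """Sum the stump contributions and threshold the log-odds at 0."""
    f = sum(wl if x[j] < t else wr for j, t, wl, wr in stumps)
    return 1 if f > 0 else 0
```

Raising `lam` shrinks every leaf weight toward zero (stronger regularization), while `eta` and `n_estimators` trade off against each other: a smaller learning rate needs more estimators, which is why the cited studies tune them jointly.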
“…A back-propagation network (BPN) combined with a multilayer feed-forward deep neural network (MLFFDNN) was utilized by [22]. XGBoost was used to estimate liver disease data, and the authors applied L1 and L2 regularization [23] in their work. The class imbalance in the ILPD dataset was handled with a minority oversampling algorithm.…”
Section: Related Work
Confidence: 99%
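The minority oversampling step mentioned above can be sketched in its simplest form: duplicate minority-class rows at random until the classes are balanced. This is our own illustrative stand-in; the cited work may instead use an interpolating variant such as SMOTE, which synthesizes new points between minority neighbours rather than copying rows.

```python
import random

def oversample_minority(X, y, seed=0):
    """Naive random oversampling for a binary-labelled dataset.

    Duplicates randomly chosen minority-class rows until both classes
    have the same number of samples. Returns new (X, y) lists.
    """
    rng = random.Random(seed)               # fixed seed for reproducibility
    by_class = {0: [], 1: []}
    for xi, yi in zip(X, y):
        by_class[yi].append(xi)
    minority = 0 if len(by_class[0]) < len(by_class[1]) else 1
    need = abs(len(by_class[0]) - len(by_class[1]))
    extra = [rng.choice(by_class[minority]) for _ in range(need)]
    return X + extra, y + [minority] * need
```

Balancing is done on the training split only; oversampling before the train/test split would leak duplicated rows into the test set and inflate the reported accuracy.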