“…Gradient boosting was found to provide the best accuracy (74.14%) when compared with logistic regression, decision tree, random forest, k-nearest neighbor, SVM, and naïve Bayes [9]. However, another study found that random forest achieved the highest accuracy when compared with logistic regression, decision tree, k-nearest neighbor, naïve Bayes, SVM, and a neural network [10]. Instead of relying on an individual technique, bagging has also been used for classification [11].…”
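For illustration only, the sketch below (not taken from [9]–[11], and using a synthetic dataset rather than the data of those studies) shows how such a comparison of individual classifiers against a bagging ensemble might be set up with scikit-learn; the model list and parameters are assumptions chosen to mirror the algorithms named above.

```python
# Illustrative comparison of individual classifiers and a bagging ensemble.
# Synthetic data stands in for the datasets used in the cited studies.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import (
    RandomForestClassifier,
    GradientBoostingClassifier,
    BaggingClassifier,
)
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.naive_bayes import GaussianNB
from sklearn.metrics import accuracy_score

X, y = make_classification(n_samples=1000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=42
)

models = {
    "Logistic regression": LogisticRegression(max_iter=1000),
    "Decision tree": DecisionTreeClassifier(random_state=42),
    "Random forest": RandomForestClassifier(random_state=42),
    "Gradient boosting": GradientBoostingClassifier(random_state=42),
    "k-nearest neighbor": KNeighborsClassifier(),
    "SVM": SVC(),
    "Naive Bayes": GaussianNB(),
    # Bagging over decision trees (the default base estimator),
    # as opposed to a single individual classifier.
    "Bagging": BaggingClassifier(random_state=42),
}

for name, model in models.items():
    model.fit(X_train, y_train)
    acc = accuracy_score(y_test, model.predict(X_test))
    print(f"{name}: {acc:.4f}")
```

The reported accuracies in [9] and [10] depend on the specific datasets and preprocessing used in those studies, so the numbers printed by this sketch are not expected to match them.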