Techniques such as data imputation for handling missing values and the detection and replacement of outliers using the Boxplot method have been used to achieve better results than those of other related works. The comparison with related works is summarized below; all results are on the publicly available Cleveland heart disease dataset ("Do" indicates the same dataset as the row above).

| Reference | Techniques | Dataset | Best accuracy |
| --- | --- | --- | --- |
|  | Boosting, bagging, stacking, and majority vote | Cleveland heart disease dataset (publicly available) | 85.48% with majority vote |
| [10] | Recursive feature elimination and GB | Do | 89.78% |
| [12] | XGB with Bayesian optimization | Do | 91.80% |
| [14] | CatBoost, GB, XGB, and ADB | Do | 83.60% with ADB |
| [17] | DNN, KDNN, XGB, KNN, decision tree, and random forest | Do | 88.65% with random forest |
| [18] | Naïve Bayes, linear model, logistic regression, decision tree, random forest, SVM, and HRFLM | Do | 88.40% with HRFLM |
| Our method | XGB, ADB, and GB | Do | 92.20% for BDT |
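The excerpt describes the pipeline only at a high level: impute missing values, detect and replace outliers with the Boxplot (IQR) rule, then compare boosting classifiers (XGB, ADB, GB) on the Cleveland data. The sketch below is a minimal illustration of that kind of pipeline, not the authors' implementation: the file name `cleveland.csv`, the `target` column, the label binarization, the choice of median imputation, replacement of outliers with the whisker values, and all hyperparameters are assumptions, and it will not necessarily reproduce the 92.20% reported above.

```python
# Minimal sketch (assumptions noted above): median imputation, Boxplot/IQR
# outlier replacement, and a 10-fold CV comparison of XGB, ADB, and GB.
import numpy as np
import pandas as pd
from sklearn.ensemble import AdaBoostClassifier, GradientBoostingClassifier
from sklearn.model_selection import cross_val_score
from xgboost import XGBClassifier

# Cleveland heart disease data; path and column names are assumptions.
df = pd.read_csv("cleveland.csv")
X = df.drop(columns=["target"])
y = (df["target"] > 0).astype(int)  # assume the usual presence/absence binarization

# 1) Imputation: fill missing values with each column's median (one common choice).
X = X.fillna(X.median(numeric_only=True))

# 2) Boxplot (IQR) rule: values outside [Q1 - 1.5*IQR, Q3 + 1.5*IQR] are treated
#    as outliers and replaced with the nearest whisker value.
for col in X.select_dtypes(include=np.number).columns:
    q1, q3 = X[col].quantile([0.25, 0.75])
    iqr = q3 - q1
    X[col] = X[col].clip(lower=q1 - 1.5 * iqr, upper=q3 + 1.5 * iqr)

# 3) Compare the three boosting classifiers with 10-fold cross-validation.
models = {
    "XGB": XGBClassifier(n_estimators=200, eval_metric="logloss"),
    "ADB": AdaBoostClassifier(n_estimators=200),
    "GB": GradientBoostingClassifier(n_estimators=200),
}
for name, model in models.items():
    acc = cross_val_score(model, X, y, cv=10, scoring="accuracy").mean()
    print(f"{name}: mean CV accuracy = {acc:.4f}")
```

The cross-validated accuracies printed by this sketch depend on the exact preprocessing and tuning, so they serve only to show how such a comparison is set up, not to validate the figures in the table.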