2020 5th International Conference on Information Technology Research (ICITR)
DOI: 10.1109/icitr51448.2020.9310892
Prediction of diabetes using cost sensitive learning and oversampling techniques on Bangladeshi and Indian female patients

Cited by 9 publications (7 citation statements)
References 20 publications
“…Many studies have proposed oversampling methods for the prediction of heart disease, diabetes and obesity, most commonly the Synthetic Minority Oversampling Technique (SMOTE) [10,12,18,19]. SMOTE is a well-known approach for constructing classifiers on imbalanced datasets.…”
Section: Related Work
confidence: 99%
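The SMOTE idea described above can be sketched in a few lines: each synthetic minority point interpolates between a random minority sample and one of its k nearest minority neighbours. This is an illustrative, pure-Python sketch of the general technique, not the exact implementation used by the cited papers; the function name and parameters are assumptions.

```python
import math
import random

def smote_sample(minority, k=2, n_new=5, seed=0):
    """Minimal SMOTE sketch (illustrative): for each synthetic point,
    pick a random minority sample, find its k nearest minority
    neighbours, and interpolate toward a random one of them."""
    rnd = random.Random(seed)
    synthetic = []
    for _ in range(n_new):
        x = rnd.choice(minority)
        # k nearest minority neighbours of x (excluding x itself)
        neighbours = sorted((p for p in minority if p is not x),
                            key=lambda p: math.dist(p, x))[:k]
        nb = rnd.choice(neighbours)
        gap = rnd.random()  # random position along the segment x -> nb
        synthetic.append(tuple(xi + gap * (ni - xi)
                               for xi, ni in zip(x, nb)))
    return synthetic
```

Because each synthetic point is a convex combination of two real minority samples, it always lies on a segment between existing minority points, which is what lets SMOTE enlarge the minority class without simply duplicating records.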
“…The classifiers showed that the J48 Decision Tree had a higher accuracy than Naïve Bayes (NB). The evaluation metrics commonly used for imbalanced classification include F1 score, accuracy, precision, recall and AUC [12,23].…”
Section: Related Work
confidence: 99%
“…The idea is to convert a cost-insensitive learner into a cost-sensitive one by increasing the number of costly-class examples and reducing the number of non-costly-class examples. Increasing the frequency of the costly class increases its weight, which reflects its importance in the learning process. This technique includes: Over-sampling: increasing the number of costly-class examples (Pranto et al., 2020; Le et al., 2018; Devi et al., 2021; Zhang et al., 2017). Under-sampling: reducing the number of less-costly-class examples (Ma et al., 2017; Tyagi and Mittal, 2020).…”
Section: Related Work
confidence: 99%