2018
DOI: 10.1016/j.ins.2017.12.023

EMDID: Evolutionary multi-objective discretization for imbalanced datasets

Cited by 29 publications (4 citation statements)
References 35 publications
“…Secondly, experimental evidence supports that discretization can improve naive Bayes performance (Mubaroq et al., 2019). Tahan and Asadi (2018) also highlight that discretization can mitigate the impact of class imbalance on supervised learning. This is interesting because class imbalance is a common condition in bioinformatics (Zhang et al., 2020).…”
Section: Background Review (mentioning)
confidence: 94%
“…Naive Bayes addresses this prerequisite because discretization is a common preprocessing step in its application (Yang and Webb, 2009). Additionally, as noted in Tahan and Asadi (2018), discretization can mitigate the impact of class imbalance on supervised learning. That is a common condition in bioinformatics (Zhang et al., 2020).…”
Section: Introduction (mentioning)
confidence: 99%
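To make the preprocessing pattern in the excerpt above concrete, here is a minimal sketch of discretizing continuous features before fitting naive Bayes on an imbalanced dataset, assuming scikit-learn as the toolkit. The synthetic data, the five quantile bins, and the balanced-accuracy scorer are illustrative choices, not the setup of the cited papers or of EMDID itself.

```python
# Minimal sketch: unsupervised discretization as a preprocessing step for
# naive Bayes on an imbalanced dataset (illustrative parameters only).
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import CategoricalNB, GaussianNB
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import KBinsDiscretizer

# Synthetic imbalanced data: roughly 10% minority class.
X, y = make_classification(n_samples=2000, n_features=10,
                           weights=[0.9, 0.1], random_state=0)

# Baseline: Gaussian naive Bayes on the raw continuous features.
baseline = GaussianNB()

# Discretized variant: equal-frequency binning, then categorical naive Bayes.
# min_categories guards against a bin being absent from a training fold.
binned = make_pipeline(
    KBinsDiscretizer(n_bins=5, encode="ordinal", strategy="quantile"),
    CategoricalNB(min_categories=5),
)

for name, model in [("raw + GaussianNB", baseline),
                    ("binned + CategoricalNB", binned)]:
    scores = cross_val_score(model, X, y, cv=5, scoring="balanced_accuracy")
    print(f"{name}: balanced accuracy = {scores.mean():.3f}")
```

Balanced accuracy is used instead of plain accuracy so the rare class is not drowned out by the majority class; whether binning actually helps depends on the data, which is the question the cited experiments address.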
“…The genetic algorithm has inherent implicit parallelism and strong global search ability, and has achieved promising results on the problem of feature discretization [47]. The genetic algorithm uses binary coding to encode the candidate breakpoints.…”
Section: Related Work (mentioning)
confidence: 99%
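As a rough illustration of the binary coding mentioned in the excerpt, the sketch below represents one feature's candidate breakpoints as a bit string: a 1 keeps the corresponding breakpoint, a 0 discards it. The candidate generation (midpoints between distinct sorted values), the purity-minus-cut-count fitness, and the random population are assumptions made for illustration; the selection, crossover, and mutation steps of a full genetic algorithm are omitted.

```python
# Sketch of binary-coded chromosomes over candidate breakpoints
# (illustrative fitness; no full GA loop).
import numpy as np

rng = np.random.default_rng(0)

def candidate_breakpoints(values):
    """Midpoints between consecutive distinct sorted values of one feature."""
    v = np.unique(values)
    return (v[:-1] + v[1:]) / 2.0

def decode(chromosome, candidates):
    """Keep only the breakpoints whose bit is 1."""
    return candidates[chromosome.astype(bool)]

def discretize(values, breakpoints):
    """Map each value to the index of the interval it falls into."""
    return np.searchsorted(breakpoints, values)

# Toy data: one continuous feature and a binary, noisy label.
x = rng.normal(size=200)
y = (x + rng.normal(scale=0.5, size=200) > 0).astype(int)

cands = candidate_breakpoints(x)

# Random initial population: one bit per candidate breakpoint.
population = rng.integers(0, 2, size=(20, cands.size))

def fitness(chromosome):
    """Reward class-pure intervals, penalize the number of kept breakpoints."""
    bins = discretize(x, decode(chromosome, cands))
    purity = 0.0
    for b in np.unique(bins):
        labels = y[bins == b]
        purity += labels.size * max(labels.mean(), 1 - labels.mean())
    return purity / y.size - 0.01 * chromosome.sum()

best = max(population, key=fitness)
print("breakpoints kept:", decode(best, cands).size, "of", cands.size)
```

A full genetic algorithm would iterate selection, crossover, and bit-flip mutation over such chromosomes; the multi-objective formulation in EMDID's title suggests several objectives are traded off rather than collapsed into a single scalar fitness like the one above.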
“…ML algorithms have been applied in flood susceptibility mapping [15–31], rainfall-runoff modeling [32,33], reservoir inflow forecasting [34,35], stream flow prediction [36,37], suspended sediment estimation [38,39] and the estimation of daily reference evapotranspiration [40,41]. Bayes-based algorithms, such as Bayesian logistic regression (BLR), and decision tree algorithms, such as random forest (RF), alternating decision tree (ADT), logistic model trees (LMT), naïve Bayes tree (NBT), reduced error pruning tree (REPTree) and classification and regression trees (CARTs), have been applied in water resource issues, especially in flood susceptibility mapping [18,42–45]. Khosravi et al. [46] developed a hybrid algorithm of bagging-decision tree for bed load transport rate prediction.…”
Section: Introduction (mentioning)
confidence: 99%