2017
DOI: 10.1007/s00500-016-2475-5

MEMOD: a novel multivariate evolutionary multi-objective discretization

Cited by 21 publications (9 citation statements)
References 56 publications
“…Reference [38] proposed an evolutionary approach that obtains a set of discretization schemes, guiding the search with a discretization criterion and the prediction accuracy of Naive Bayes. In Reference [39], classification error and the number of cut points are reduced simultaneously using evolutionary multi-objective optimization.…”
Section: Backgrounds
confidence: 99%
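The bi-objective formulation described above — jointly minimizing classification error and the number of cut points — can be illustrated with a minimal Pareto-front filter. This is a generic sketch, not the cited authors' implementation, and the candidate (error, cut-point count) pairs are made up:

```python
def dominates(a, b):
    """a dominates b if a is no worse in both objectives and strictly
    better in at least one (both objectives are minimized)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(candidates):
    """Keep only the discretization schemes not dominated by any other.
    Each candidate is a (classification_error, num_cut_points) pair."""
    return [c for c in candidates
            if not any(dominates(o, c) for o in candidates if o != c)]

# Hypothetical (error, cut-point count) pairs for candidate schemes.
schemes = [(0.12, 30), (0.15, 12), (0.10, 45), (0.15, 40), (0.20, 10)]
front = pareto_front(schemes)
# (0.15, 40) is dominated by (0.12, 30): worse error AND more cut points.
```

An evolutionary multi-objective method such as the one in the cited work evolves a population of discretization schemes and returns an approximation of this front rather than a single scheme.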
“…To discretize the data sets with the reference discretization methods, the KEEL tool (http://sci2s.ugr.es/keel/download.php) is used. During our literature review, we observed that several studies report the classification accuracy of their proposed methods using the C4.5 and naive Bayes classifiers [5,12,17-20]. For this reason, in this study, the predictive accuracy of lFIT is also evaluated using these two classifiers from the Weka tool (https://www.cs.waikato.ac.nz/ml/weka/).…”
Section: Datasets and Experimental Settings
confidence: 99%
“…To evaluate the performance of the proposed method, ten benchmark classification datasets are discretized with two unsupervised discretization methods, namely equal-width (EW) and equal-frequency (EF), six supervised discretization methods, namely 1R [11], class-attribute contingency coefficient (CACC) [12], class-attribute interdependence maximization (CAIM) [13], MODL [14], MDLP [15] and Hellinger [16], and the proposed method, and the numbers of inconsistencies and cut points generated by the methods are compared. To evaluate predictive accuracy, similar to [5,12,17-20], the C4.5 and naive Bayes classifiers are trained with each discretized scheme and the predictive accuracies of the methods are compared. The experimental results show that lFIT ranks among the top methods in predictive accuracy and generates schemes with a low number of inconsistencies and a larger number of intervals.…”
Section: Introduction
confidence: 99%
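The two unsupervised baselines named above, equal-width (EW) and equal-frequency (EF), can be sketched in a few lines. This is a generic illustration with made-up data, not the code used in any of the cited studies:

```python
def equal_width_cuts(values, k):
    """Equal-width (EW): split [min, max] into k intervals of equal
    length and return the k-1 interior cut points."""
    lo, hi = min(values), max(values)
    step = (hi - lo) / k
    return [lo + i * step for i in range(1, k)]

def equal_frequency_cuts(values, k):
    """Equal-frequency (EF): choose cut points so that each interval
    holds roughly the same number of observations."""
    s = sorted(values)
    n = len(s)
    return [s[(i * n) // k] for i in range(1, k)]

data = [1.0, 2.0, 2.5, 3.0, 4.0, 8.0, 9.0, 10.0]
ew = equal_width_cuts(data, 3)      # → [4.0, 7.0]
ef = equal_frequency_cuts(data, 3)  # → [2.5, 8.0]
```

Supervised methods such as MDLP or CAIM instead place cut points using the class labels, which is why the cited comparisons report both inconsistency counts and downstream classifier accuracy.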
“…ML algorithms have been applied in flood susceptibility mapping [15-31], rainfall-runoff modeling [32,33], reservoir inflow forecasting [34,35], stream flow prediction [36,37], suspended sediment estimation [38,39] and the estimation of daily reference evapotranspiration [40,41]. Bayes-based algorithms, such as Bayesian logistic regression (BLR), and decision tree algorithms, such as random forest (RF), alternating decision tree (ADT), logistic model trees (LMT), naïve Bayes tree (NBT), reduced error pruning tree (REPTree) and classification and regression trees (CARTs), have been applied to water resource issues, especially flood susceptibility mapping [18,42-45]. Khosravi et al [46] developed a hybrid bagging-decision tree algorithm for bed load transport rate prediction.…”
Section: Introduction
confidence: 99%