2017
DOI: 10.1007/s11063-017-9703-6
Pruning the Ensemble of ANN Based on Decision Tree Induction

Cited by 8 publications (3 citation statements) · References 36 publications
“…To verify the utility of the ECDPB, we compare the proposed method with the following techniques: GASEN [17], DMEP [20], MDOEP [39], IBAFSEN [19], PEAD [40], RCOA [34], and IDAFMEP [16]. The primary goal of these techniques is to enhance the classification capacity in comparison with a bagging ensemble.…”
Section: Results (citation type: mentioning; confidence: 99%)
“…To test the classification ability of HEPCBR further, massive experiments were implemented by comparing it with other techniques: bagging [51], kappa [28], AGOB [26], POBE [27], DREP [22], DEELM [39], GASEN [33], GSOEP [34], MOAG [30], RREP [10], DMEP [40], EPSCG [20], DASEP [24], MDOEP [31], RCOA [32], and PEAD [41]. Bagging extracts the training samples with equal probability, and it can construct an initial pool composed of multiple learners with a good diversity.…”
Section: Comparison With Other Methods (citation type: mentioning; confidence: 99%)
“…Widely used techniques include k-means [35] and deterministic annealing [36]. Finally, many scholars study other different pruning methods to acquire the selection of learners, for example, frequent pattern [7], a randomized greedy selective strategy and ballot [37], greedy randomized dynamic pruning [38], confidence interval based on double-fault measure [39], graph coloring way [40], cost-sensitive rotation forest [17], induction of decision tree [41], and simple coalitional games [20].…”
Section: Introduction (citation type: mentioning; confidence: 99%)
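Among the pruning families listed above, the clustering approach (e.g. k-means) can be sketched as follows: represent each learner by its prediction vector on a validation set, cluster those vectors, and keep one representative per cluster, shrinking the ensemble while preserving diversity. This is a hedged illustration of the general idea, not the specific algorithm of any cited paper; all names are hypothetical.

```python
import random

def kmeans(vectors, k, iters=20, seed=0):
    """Plain k-means on lists of floats; returns final cluster centres."""
    rng = random.Random(seed)
    centers = rng.sample(vectors, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for v in vectors:
            # assign each vector to its nearest centre (squared Euclidean distance)
            i = min(range(k),
                    key=lambda c: sum((a - b) ** 2 for a, b in zip(v, centers[c])))
            clusters[i].append(v)
        # recompute centres; keep the old centre if a cluster emptied out
        centers = [[sum(col) / len(cl) for col in zip(*cl)] if cl else centers[i]
                   for i, cl in enumerate(clusters)]
    return centers

def prune_by_clustering(pred_vectors, k):
    """Keep the index of the learner closest to each cluster centre."""
    centers = kmeans(pred_vectors, k)
    kept = set()
    for c in centers:
        kept.add(min(range(len(pred_vectors)),
                     key=lambda j: sum((a - b) ** 2
                                       for a, b in zip(pred_vectors[j], c))))
    return sorted(kept)
```

Learners whose prediction vectors fall in the same cluster behave similarly, so retaining one per cluster discards redundancy rather than useful diversity.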