2018
DOI: 10.1109/access.2017.2788700

Construction of Near-Optimal Axis-Parallel Decision Trees Using a Differential-Evolution-Based Approach

Cited by 40 publications (24 citation statements)
References 64 publications

“…The differential evolution algorithm is a powerful and effective evolutionary algorithm, first proposed by Storn and Price [32]. It is easy to implement and has very few control parameters [33]. It has been shown to outperform algorithms such as the genetic algorithm, evolution strategies, adaptive simulated annealing [32], and particle swarm optimization [34], [35].…”
Section: B. Differential Evolution Algorithm (mentioning)
confidence: 99%
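
The excerpt above describes the classical differential evolution (DE) scheme. As a concrete illustration, below is a minimal sketch of the DE/rand/1/bin variant of Storn and Price applied to a toy sphere function; the objective, the bounds, and the control parameter values (population size, F, CR) are illustrative assumptions, not values taken from the cited papers.

```python
# Minimal sketch of the classical DE/rand/1/bin scheme (Storn & Price).
# The objective, bounds, and control parameters (pop_size, F, CR) are
# illustrative assumptions, not values from the cited papers.
import random

def sphere(x):
    """Toy objective: sum of squares, minimized at the origin."""
    return sum(v * v for v in x)

def differential_evolution(f, bounds, pop_size=20, F=0.8, CR=0.9, generations=200):
    dim = len(bounds)
    # Initialize the population uniformly at random inside the bounds.
    pop = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    fitness = [f(ind) for ind in pop]
    for _ in range(generations):
        for i in range(pop_size):
            # Mutation: combine three distinct individuals other than i.
            a, b, c = random.sample([j for j in range(pop_size) if j != i], 3)
            mutant = [pop[a][k] + F * (pop[b][k] - pop[c][k]) for k in range(dim)]
            # Binomial crossover with one guaranteed gene from the mutant.
            j_rand = random.randrange(dim)
            trial = [mutant[k] if (random.random() < CR or k == j_rand) else pop[i][k]
                     for k in range(dim)]
            # Clip the trial vector back into the search bounds.
            trial = [min(max(v, lo), hi) for v, (lo, hi) in zip(trial, bounds)]
            # Greedy selection: the trial replaces its parent only if no worse.
            ft = f(trial)
            if ft <= fitness[i]:
                pop[i], fitness[i] = trial, ft
    best = min(range(pop_size), key=lambda i: fitness[i])
    return pop[best], fitness[best]

if __name__ == "__main__":
    x, fx = differential_evolution(sphere, bounds=[(-5.0, 5.0)] * 3)
    print(x, fx)
```

The one-to-one greedy selection in the last step is what distinguishes DE from a plain genetic algorithm: a trial vector replaces its parent only if it is no worse, so the population's best fitness never degrades.
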
“…To cluster categorical data, we use formula (13) in place of formula (7) in steps 4 and 7 of Algorithm 1, and formula (15) in place of formula (10) in step 6.…”
Section: Categorical Feature (mentioning)
confidence: 99%
“…Despite the practical success of decision trees, their optimal construction has been proven to be NP-complete [11]. To avoid local optima, some researchers have adopted evolutionary algorithms to build decision trees [12][13][14]. However, owing to time complexity, the most popular algorithms, such as ID3 [15], C4.5 [16], and CART [17], and their various modifications [18], are greedy by nature and construct the decision tree in a top-down, recursive manner.…”
Section: Introduction (mentioning)
confidence: 99%
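
For contrast with the evolutionary approaches mentioned in the excerpt, the following is a minimal sketch of the greedy, top-down recursion that ID3-style algorithms share: at each node, exhaustively search the axis-parallel (feature, threshold) splits for the one with maximum information gain, then recurse on the two resulting subsets. The entropy criterion, function names, and stopping rules here are illustrative assumptions, not the exact procedure of any one cited algorithm.

```python
# Minimal sketch of greedy top-down decision-tree induction in the style
# of ID3/C4.5/CART. Names and stopping rules are illustrative assumptions.
import math
from collections import Counter

def entropy(labels):
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def best_split(X, y):
    """Exhaustively search (feature, threshold) pairs for maximum gain."""
    base, n = entropy(y), len(y)
    best = (0.0, None, None)  # (gain, feature, threshold)
    for f in range(len(X[0])):
        for t in sorted({row[f] for row in X}):
            left = [y[i] for i in range(n) if X[i][f] <= t]
            right = [y[i] for i in range(n) if X[i][f] > t]
            if not left or not right:
                continue
            gain = (base - (len(left) / n) * entropy(left)
                         - (len(right) / n) * entropy(right))
            if gain > best[0]:
                best = (gain, f, t)
    return best

def build_tree(X, y, depth=0, max_depth=5):
    gain, f, t = best_split(X, y)
    # Stop on purity, depth limit, or when no split improves the node.
    if len(set(y)) == 1 or depth == max_depth or f is None:
        return Counter(y).most_common(1)[0][0]  # leaf: majority class
    L = [i for i in range(len(y)) if X[i][f] <= t]
    R = [i for i in range(len(y)) if X[i][f] > t]
    return (f, t,
            build_tree([X[i] for i in L], [y[i] for i in L], depth + 1, max_depth),
            build_tree([X[i] for i in R], [y[i] for i in R], depth + 1, max_depth))
```

Because each split is chosen locally and never revisited, the recursion can commit early to a suboptimal partition; this is exactly the local-optimum problem that the evolutionary approaches cited above try to avoid by searching over whole trees.
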
“…The starting node is called the root node, the internal nodes are the set of nodes Child1 to Child3, and the bottom nodes are the leaf nodes, which carry the class labels [24]. In the DT model, to construct a reasonably good tree and to select the splitting attribute at each node, the Gini impurity (cost function) is used, as given below [24].…”
Section: B: Decision Trees (mentioning)
confidence: 99%
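
The excerpt above elides the equation itself. The standard Gini impurity used by CART for a node whose classes occur with proportions p_c is Gini = 1 - Σ_c p_c², which is 0 for a pure node; a candidate split is then scored by the size-weighted impurity of its children. A minimal sketch follows, with helper names that are illustrative, not from the cited paper.

```python
# Standard Gini impurity as used by CART: Gini(t) = 1 - sum_c p_c ** 2.
# Helper names are illustrative assumptions.
from collections import Counter

def gini(labels):
    """Gini impurity: 0 for a pure node, approaching 1 - 1/k for k classes."""
    n = len(labels)
    return 1.0 - sum((c / n) ** 2 for c in Counter(labels).values())

def weighted_gini(children):
    """Cost of a split: child impurities weighted by child sizes."""
    total = sum(len(ch) for ch in children)
    return sum(len(ch) / total * gini(ch) for ch in children)

# Example: a pure node scores 0, a 50/50 node scores 0.5.
print(gini(["a", "a", "a"]))                         # 0.0
print(gini(["a", "b", "a", "b"]))                    # 0.5
print(weighted_gini([["a", "a"], ["b", "b", "a"]]))  # ~0.267
```
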