2016
DOI: 10.1007/s10044-016-0533-3
Decision tree induction based on minority entropy for the class imbalance problem

Cited by 41 publications (28 citation statements)
References 15 publications
“…For the samples with i = 2, 3, 4372, 4373, all nearest neighbors up to neighborhood size 13 belong to one class, and a neighbor of a different class appears when the neighborhood size becomes 15. This results in an entropy of 0 for k = 1, 3, 5, 7, 9, 11, 13 and a nonzero entropy for k = 15. For the samples with i = 4, 5, …, 9, 4366, 4367, …, 4371, all nearest neighbors up to neighborhood size 11 belong to one class, and neighbors of a different class appear when the neighborhood size exceeds 11.…”
Section: Graphical Analysis of Nearest Neighbors Entropy
confidence: 99%
“…For the samples with i = 4, 5, …, 9, 4366, 4367, …, 4371, all nearest neighbors up to neighborhood size 11 belong to one class, and neighbors of a different class appear when the neighborhood size exceeds 11. This results in an entropy of 0 for k = 1, 3, 5, 7, 9, 11 and a nonzero entropy for k = 13, 15. We classify each sample based on its number of nonzero entropies in Figure 3.…”
Section: Graphical Analysis of Nearest Neighbors Entropy
confidence: 99%
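
The two statements above describe the same diagnostic: for each sample, compute the class entropy of its k nearest neighbors over odd neighborhood sizes and count how many of those entropies are nonzero; a single-class neighborhood gives entropy 0, and the entropy turns positive once a neighbor of a different class enters. The sketch below is a minimal illustration of that idea, not the cited authors' code; the Euclidean distance, the odd sizes k = 1, 3, …, 15, the toy data, and the function names are all assumptions.

    import numpy as np

    def knn_class_entropy(X, y, i, k):
        # Shannon entropy (base 2) of the class labels among the k nearest
        # neighbors of sample i; the sample itself is excluded.
        d = np.linalg.norm(X - X[i], axis=1)
        d[i] = np.inf                      # never count a sample as its own neighbor
        nn = np.argsort(d)[:k]             # indices of the k nearest neighbors
        _, counts = np.unique(y[nn], return_counts=True)
        p = counts / k
        return float(-(p * np.log2(p)).sum())   # 0 when all neighbors share one class

    def nonzero_entropy_count(X, y, i, ks=(1, 3, 5, 7, 9, 11, 13, 15)):
        # Number of odd neighborhood sizes k whose neighbor entropy is nonzero,
        # i.e. sizes at which a different-class neighbor has already appeared.
        return sum(knn_class_entropy(X, y, i, k) > 0 for k in ks)

    # Toy imbalanced data (assumed for illustration): 50 majority and 5 minority
    # samples. A sample whose neighborhood stays single-class up to some size k
    # has entropy 0 there and a positive entropy once a different-class
    # neighbor enters the neighborhood.
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(0.0, 1.0, (50, 2)), rng.normal(3.0, 1.0, (5, 2))])
    y = np.array([0] * 50 + [1] * 5)
    print([nonzero_entropy_count(X, y, i) for i in range(50, 55)])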
“…The core measures of entropy-minimization-based discretization include information entropy and gain [24,25]. Let the k classes in sample set S be C_1, C_2, …, C_k, and let P(C_i, S) be the proportion of samples in S that have class C_i.…”
Section: Data Discretization
confidence: 99%
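
The quoted definition is the standard one: the class entropy of a sample set S is Ent(S) = -sum_i P(C_i, S) log2 P(C_i, S), and a candidate cut point on a continuous attribute is scored by its information gain, i.e. Ent(S) minus the size-weighted entropies of the two subsets the cut produces. A small sketch under those definitions (the function names, the binary-split interface, and the toy values are assumptions, not the cited implementation):

    import numpy as np

    def class_entropy(labels):
        # Ent(S) = -sum_i P(C_i, S) * log2 P(C_i, S), where P(C_i, S) is the
        # proportion of samples in S that have class C_i.
        _, counts = np.unique(labels, return_counts=True)
        p = counts / counts.sum()
        return float(-(p * np.log2(p)).sum())

    def information_gain(values, labels, cut):
        # Gain of splitting S at `cut` on a continuous attribute:
        # Ent(S) - |S1|/|S| * Ent(S1) - |S2|/|S| * Ent(S2).
        left, right = labels[values <= cut], labels[values > cut]
        n = len(labels)
        return (class_entropy(labels)
                - len(left) / n * class_entropy(left)
                - len(right) / n * class_entropy(right))

    # Entropy-minimization discretization picks the cut with the highest gain
    # (equivalently, the lowest weighted class entropy after the split).
    values = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
    labels = np.array([0, 0, 0, 1, 1, 1])
    cuts = (values[:-1] + values[1:]) / 2        # midpoints of the sorted values
    best = max(cuts, key=lambda c: information_gain(values, labels, c))
    print(best, information_gain(values, labels, best))   # cut 3.5, gain 1.0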