2020
DOI: 10.18178/ijmlc.2020.10.3.955
Decision Tree Algorithm with Class Overlapping-Balancing Entropy for Class Imbalanced Problem

Abstract: Handling the class imbalanced problem by modifying the decision tree algorithm has received widespread attention in recent years. This paper introduces a new splitting measure, called class overlapping-balancing entropy (OBE), that pays equal attention to all classes. At each step, the proportion of each class is balanced via assigned weight values, which not only equalize the classes but also take into account the overlapping region between them. The proporti…
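The splitting measure described in the abstract can be sketched in code. This is a minimal illustration under stated assumptions, not the paper's exact OBE formula: the weights here only equalize class priors within the parent node, whereas the published measure additionally weights the class-overlapping region. The function names `weighted_entropy` and `obe_split_score` are hypothetical.

```python
import numpy as np

def weighted_entropy(y, class_weights):
    """Entropy over class masses reweighted by per-class weights.

    With weights set to inverse class counts, every class contributes
    equal total mass, so the majority class cannot dominate the measure.
    """
    classes = list(class_weights)
    mass = np.array([class_weights[c] * np.sum(y == c) for c in classes], float)
    total = mass.sum()
    if total == 0.0:
        return 0.0
    p = mass / total
    p = p[p > 0]                      # 0 * log(0) is taken as 0
    return float(-(p * np.log2(p)).sum())

def obe_split_score(y_left, y_right, y_parent):
    """Weighted average of the class-balanced entropies of the two children.

    Weights are 1 / (class count in the parent node), so each class's
    proportion is balanced before the entropy is computed (the overlap
    term of the full OBE measure is omitted in this sketch).
    """
    classes, counts = np.unique(y_parent, return_counts=True)
    w = {c: 1.0 / n for c, n in zip(classes, counts)}
    n = len(y_parent)
    return (len(y_left) / n) * weighted_entropy(y_left, w) \
         + (len(y_right) / n) * weighted_entropy(y_right, w)
```

For a 90/10 imbalanced parent node, the balanced entropy of the parent is 1.0 bit (both classes carry equal mass), and a split that isolates the minority class scores 0, so the decision tree is rewarded for separating the minority class rather than for purity on the majority alone.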

Cited by 9 publications (3 citation statements)
References 22 publications
“…Reference [40] derives weights from the class distribution to improve the importance of the minority class. Reference [41] introduces an overlapping weighting function for the same purpose. However, heuristic judgment (i.e., determination by experts) is not rigorous, and grid search is time-consuming and laborious.…”
Section: Datasets Description
confidence: 99%
“…Many techniques in this approach are oversampling techniques [9,10,11], which synthesize random instances from the minority class while avoiding those from the majority class; undersampling techniques [12,13], which discard random instances from the majority class to extend the region of minority-class instances; or a mixture of oversampling and undersampling [14]. Second, an algorithmic-level methodology upgrades or reimplements classification algorithms to be more robust to noise while handling minority instances successfully [15,16,17,18,19]. Third, the hybrid methodology combines the data-level and the algorithmic-level approach, such as AdaBoost [20], Boosting [21], Bagging [22], etc.…”
Section: Introduction
confidence: 99%
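The random oversampling and undersampling ideas referenced above can be sketched as follows. This is a minimal illustration for the two-class case; `random_resample` is a hypothetical helper, and real oversamplers such as SMOTE synthesize new instances rather than duplicating existing ones.

```python
import random
from collections import Counter

def random_resample(X, y, mode="over", seed=0):
    """Balance a two-class dataset by random resampling.

    mode="over":  duplicate random minority instances until the class
                  sizes match (no majority instance is touched).
    mode="under": keep all minority instances and a random subset of
                  majority instances of the same size.
    """
    rng = random.Random(seed)
    (maj, n_maj), (mino, n_min) = Counter(y).most_common()
    idx_maj = [i for i, c in enumerate(y) if c == maj]
    idx_min = [i for i, c in enumerate(y) if c == mino]
    if mode == "over":
        extra = [rng.choice(idx_min) for _ in range(n_maj - n_min)]
        keep = list(range(len(y))) + extra
    else:
        keep = idx_min + rng.sample(idx_maj, n_min)
    return [X[i] for i in keep], [y[i] for i in keep]
```

Either mode yields equal class counts; the trade-off the cited papers discuss is that oversampling risks overfitting on duplicated minority points, while undersampling discards potentially informative majority points.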
“…There are many techniques in this approach, such as oversampling techniques [12,13,14], which randomly synthesize instances from the minority class while ignoring the majority class; undersampling techniques [15,16], which randomly remove instances from the majority class to expand the region of minority-class instances; or a combination of both oversampling and undersampling [17]. Second, an algorithmic-level methodology enhances or reimplements classification algorithms to be more resilient to noise while successfully handling minority instances [18,19,20,21,22]. An algorithmic-level methodology efficiently deals with the class imbalanced problem using the original data as available, without any modification.…”
Section: Techniques To Solve a Class Imbalance Problem
confidence: 99%