2019
DOI: 10.1109/jas.2019.1911447
An embedded feature selection method for imbalanced data classification

Cited by 313 publications (105 citation statements)
References 34 publications
“…However, F-measure achieved excellent performance only if 20% or more of the features were chosen. The results help practitioners select a proper feature selection method when facing a practical problem [18].…”
Section: Table I
confidence: 97%
“…Compared with the state-of-the-art algorithms, the proposed algorithm could acquire better classification performance on the 27 imbalanced data sets. Liu et al. [25] developed an embedded feature selection algorithm named the weighted Gini index (WGI) by adding an index-weighting method to classification and regression trees to deal with the class imbalance problem. Experiments showed that WGI could achieve better performance only if 20% or more of the features were chosen, compared to Chi2, F-statistic, and the Gini index.…”
Section: Related Work: A Cost-sensitive Learning Algorithm
confidence: 99%
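The excerpt above summarizes the weighted Gini index (WGI) idea: re-weighting class proportions inside the Gini impurity so that tree splits are not dominated by the majority class. A minimal sketch, assuming inverse-frequency class weights; the function name and the exact weighting scheme here are illustrative, and the paper's actual WGI formulation may differ:

```python
from collections import Counter

def weighted_gini(labels, class_weights):
    """Class-weighted Gini impurity (hypothetical WGI-style formulation).

    Each class count is scaled by its weight before computing the
    usual 1 - sum(p_i^2), so a rare class with a large weight
    contributes as much to impurity as a frequent one.
    """
    counts = Counter(labels)
    total = sum(class_weights[c] * n for c, n in counts.items())
    return 1.0 - sum(
        (class_weights[c] * n / total) ** 2 for c, n in counts.items()
    )

# A 9:1 imbalanced node with inverse-frequency weights: the weighted
# class proportions become 0.5/0.5, so impurity is maximal (0.5),
# whereas the unweighted Gini would be only 1 - 0.81 - 0.01 = 0.18.
labels = [0] * 9 + [1]
weights = {0: 1 / 9, 1: 1.0}
print(weighted_gini(labels, weights))  # → 0.5
```

With uniform weights the function reduces to the ordinary Gini index, which is why such a weighting can be embedded directly into a CART-style split criterion.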
“…A. DATA PREPROCESSING. An effective feature selection method is critical to the preprocessing of a dataset [51]. A dataset of historical resource demands includes a large number of containers of various types.…”
Section: An Adaptive Prediction Algorithm for Cloud Resource Demands
confidence: 99%