2021
DOI: 10.1016/j.knosys.2021.106901
Ensemble learning-based filter-centric hybrid feature selection framework for high-dimensional imbalanced data

Cited by 28 publications (10 citation statements). References 28 publications.
“…Feature selection is commonly considered as a necessary preprocessing step for classification tasks in the context of class imbalance [20][21][22][23] to construct an optimal feature subset. The commonly used feature selection methods can be mainly categorized into three types: Filter, Wrapper and Embedded [24].…”
Section: Introduction (mentioning; confidence: 99%)
“…As for dimension reduction, many methods have been proposed so far. However, some apply only to specific domains [ 5 , 6 , 7 , 8 ], some are aimed at solving high-dimension problems [ 29 ], some solve the problem of unbalanced samples, some focus on the relationships between features [ 13 , 30 ], and some adopt heuristic algorithms. They are limited to a particular field or condition.…”
Section: Introduction (mentioning; confidence: 99%)
“…Feature selection aims to select the most applicable features while ignoring the irrelevant and redundant ones, and feature selection methods are generally divided into three types: filter, wrapper and embedded [ 6 , 10 , 11 , 20 , 29 , 30 ].…”
Section: Introduction (mentioning; confidence: 99%)
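The filter category mentioned in the quoted statements scores each feature independently of any classifier and keeps the top-ranked ones. A minimal sketch of this idea, ranking features by absolute Pearson correlation with a binary label (illustrative only; real filter methods also use chi-square, mutual information, ReliefF, etc., and the data below is synthetic):

```python
import numpy as np

def filter_select(X, y, k):
    """Return indices of the k features most correlated with the label y."""
    Xc = X - X.mean(axis=0)
    yc = y - y.mean()
    # Per-feature Pearson correlation with the label (vectorized).
    num = Xc.T @ yc
    den = np.sqrt((Xc ** 2).sum(axis=0) * (yc ** 2).sum()) + 1e-12
    scores = np.abs(num / den)
    # Rank features by score, descending, and keep the top k.
    return np.argsort(scores)[::-1][:k]

# Toy example: features 0 and 2 track the label, feature 1 is pure noise.
rng = np.random.default_rng(0)
y = rng.integers(0, 2, size=200)
X = np.column_stack([
    y + 0.1 * rng.normal(size=200),   # informative
    rng.normal(size=200),             # irrelevant
    -y + 0.1 * rng.normal(size=200),  # informative (negatively correlated)
])
print(filter_select(X, y, k=2))
```

Because the score is computed per feature, the cost is linear in the number of features, which is why filter methods remain tractable on high-dimensional data where wrapper methods do not.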
“…Many literature studies show that feature selection methods based on metaheuristics have excellent performance in solving common feature selection problems. However, with the expansion of search space, especially when the number of features reaches thousands, its calculation cost will increase exponentially [ 6 ].…”
Section: Introduction (mentioning; confidence: 99%)
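The combinatorial point in the statement above can be made concrete: an exhaustive search over n features must consider 2**n − 1 non-empty subsets, so each additional feature doubles the search space (a quick illustration, not taken from the cited paper):

```python
# Number of non-empty feature subsets for n features: 2**n - 1.
# At n = 1000 the count is astronomically large, which is why
# metaheuristic feature selection becomes costly as dimensionality grows.
for n in (10, 20, 1000):
    print(n, 2 ** n - 1)
```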