2023
DOI: 10.1016/j.engappai.2023.106080
Glee: A granularity filter for feature selection

Cited by 13 publications (5 citation statements)
References 54 publications
“…Ba et al. [23] present Glee, a novel Granular Computing (GrC) based framework for efficient and effective feature selection. Glee calculates the granularity value for each feature, reorders them accordingly, and adds features to the selection pool one by one until a termination condition is met.…”
Section: Related Work
confidence: 99%
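For orientation, the following is a minimal sketch of the procedure described in the statement above, assuming discrete-valued features and a stopping rule that ends selection once the chosen subset yields a decision-consistent granulation; the granularity measure and the termination test are illustrative placeholders, not the exact definitions of Ba et al.

# Minimal sketch of a Glee-style granularity filter (illustrative only).
# Assumes a discrete data matrix X (n_samples x n_features) and labels y;
# the granularity measure and the stopping rule are placeholders, not the
# exact definitions from Ba et al.
from collections import defaultdict
import numpy as np

def granularity(col):
    # Placeholder granularity: mean relative size of the equivalence
    # classes (information granules) induced by a single feature.
    _, counts = np.unique(col, return_counts=True)
    return counts.mean() / len(col)

def consistent(X_sub, y):
    # Termination check: samples that agree on the selected features
    # must also agree on the label (a rough-set style consistency test).
    granules = defaultdict(set)
    for row, label in zip(map(tuple, X_sub), y):
        granules[row].add(label)
    return all(len(labels) == 1 for labels in granules.values())

def glee_like_filter(X, y):
    # Rank all features once by granularity, then add them one by one
    # until the termination condition is met.
    order = np.argsort([granularity(X[:, j]) for j in range(X.shape[1])])
    selected = []
    for j in order:
        selected.append(int(j))
        if consistent(X[:, selected], y):
            break
    return selected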
“…Generally speaking, we need to evaluate the significance of attributes in AT, eliminate low-quality attributes from the reduct pool, and select qualified attributes. Based on greedy searching for attribute reduction [44][45][46], Definition 15 provides an attribute significance measure built on our proposed quality-to-entropy ratio.…”
Section: Theorem 1: For a Given Decision System DS, a Radius
confidence: 99%
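To illustrate the kind of greedy attribute reduction referred to in the statement above, a short sketch follows; the dependency-based significance function is a generic placeholder and is not the quality-to-entropy ratio of Definition 15.

# Sketch of greedy forward attribute reduction over a decision system given
# as a discrete matrix X and decision labels y. The significance of a
# candidate attribute is modeled as the gain in dependency (fraction of
# samples lying in label-pure granules) when the attribute joins the reduct.
from collections import defaultdict
import numpy as np

def dependency(X_sub, y):
    # Fraction of samples whose granule (equivalence class) is pure in y.
    granules = defaultdict(list)
    for i, row in enumerate(map(tuple, X_sub)):
        granules[row].append(y[i])
    pure = sum(len(v) for v in granules.values() if len(set(v)) == 1)
    return pure / len(y)

def greedy_reduct(X, y, eps=1e-6):
    reduct, remaining = [], list(range(X.shape[1]))
    base = 0.0
    while remaining:
        # Significance of each candidate = quality gain when it joins the reduct.
        gains = [(dependency(X[:, reduct + [a]], y) - base, a) for a in remaining]
        best_gain, best_attr = max(gains)
        if best_gain <= eps:  # no remaining attribute is significant
            break
        reduct.append(best_attr)
        remaining.remove(best_attr)
        base += best_gain
    return reduct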
“…The main target of feature selection is to obtain a representative feature combination from the original feature set that maximizes the overall accuracy (OA) [41], an important evaluation criterion; decreasing the number of selected features is also a crucial target in feature selection. In this paper, the objective function is used to evaluate the feature combination searched by agents [42]; it is described in Equation (7).…”
Section: The Objective Function
confidence: 99%
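Equation (7) itself is not reproduced in this excerpt; as a hedged illustration, objectives of this kind commonly weigh classification error against the fraction of features kept. The weight alpha, the k-NN classifier, and the cross-validation setup below are assumptions for the example, not the cited paper's exact formulation.

# Sketch of a typical feature-selection objective (lower is better) that
# trades classification error against the selection ratio. alpha, the k-NN
# classifier, and 5-fold cross-validation are assumptions for this example.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

def objective(mask, X, y, alpha=0.99):
    idx = np.flatnonzero(mask)
    if idx.size == 0:
        return 1.0  # an empty feature subset is the worst candidate
    acc = cross_val_score(KNeighborsClassifier(), X[:, idx], y, cv=5).mean()
    return alpha * (1.0 - acc) + (1.0 - alpha) * idx.size / X.shape[1]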
“…Feature extraction involves a linear or nonlinear transformation of the original high-dimensional features, such as combining different features into a new feature set [6], in which the features lose their original physical meaning. Feature selection involves selecting the most representative feature combination from the dataset; it identifies representative features and reduces redundant information and noise in the data, which improves classification accuracy and enhances comprehensibility [7]. Because features obtained by feature extraction are difficult to interpret, feature selection is widely used in the processing of HSI datasets.…”
Section: Introduction
confidence: 99%