2014
DOI: 10.1155/2014/479289

Feature Selection with Neighborhood Entropy-Based Cooperative Game Theory

Abstract: Feature selection plays an important role in machine learning and data mining. In recent years, various feature measurements have been proposed to select significant features from high-dimensional datasets. However, most traditional feature selection methods will ignore some features which have strong classification ability as a group but are weak as individuals. To deal with this problem, we redefine the redundancy, interdependence, and independence of features by using neighborhood entropy. Then the neighbor…
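For intuition, the sketch below computes a neighborhood entropy of the kind the abstract builds on, assuming the common formulation NH_delta(B) = -(1/n) * sum_i log(|delta_B(x_i)| / n), where delta_B(x_i) is the set of samples within distance delta of x_i under feature subset B. The paper's exact definitions of redundancy, interdependence, and independence are not reproduced; all names here are illustrative.

```python
import numpy as np

def neighborhood_entropy(X, delta):
    """Neighborhood entropy NH_delta = -(1/n) * sum_i log(|delta(x_i)| / n),
    where delta(x_i) is the set of samples within distance `delta` of x_i.
    A sketch of the common formulation; names are illustrative."""
    n = X.shape[0]
    # Pairwise Euclidean distances between all samples.
    dists = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    # Neighborhood sizes (each sample belongs to its own neighborhood).
    sizes = (dists <= delta).sum(axis=1)
    return -np.mean(np.log(sizes / n))

# Example: entropy of a random two-feature subset.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 2))
print(neighborhood_entropy(X, delta=0.5))
```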

Cited by 12 publications (10 citation statements, published 2016–2022) | References 16 publications
“…For the second stage, the generated features are streamed sequentially into the processing window. After the number of features in the cache window reaches the window size again, the value of the feature repulsion loss function is calculated for all candidate features in the cache window according to (12). Hence, the candidate features with the largest feature repulsion loss (FRL) are saved to the selected feature subset.…”
Section: Streaming Feature Selection (mentioning)
confidence: 99%
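The two-stage cache-window procedure quoted above can be sketched as follows. The actual feature repulsion loss is Eq. (12) of the citing paper and is not reproduced here, so `frl_score` is a placeholder callable; `window_size`, `top_k`, and the other names are assumptions for illustration.

```python
def stream_select(feature_stream, window_size, frl_score, top_k=1):
    """Hedged sketch of the cache-window step: buffer streamed features
    until the window is full, score every candidate with a stand-in for
    the feature repulsion loss, and keep the highest scorers."""
    selected, window = [], []
    for f in feature_stream:
        window.append(f)
        if len(window) == window_size:  # cache window reaches its size again
            # Score every candidate in the window against the selected subset.
            scored = sorted(window, key=lambda g: frl_score(g, selected),
                            reverse=True)
            # Candidates with the largest FRL join the selected feature subset.
            selected.extend(scored[:top_k])
            window.clear()
    return selected
```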
“…In this paper, we consider only filtering methods which seek to design effective metrics for evaluating the importance of candidate features. Such metrics can be based on information entropy [11][12][13][14], the classification boundary [15], and rough sets [16,17]. In particular, Lee et al [18] proposed a multivariate mutual information criterion for multi-label feature selection.…”
Section: Introduction (mentioning)
confidence: 99%
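As a concrete example of an entropy-based filter metric of the kind this excerpt surveys, the sketch below scores a discrete feature by its mutual information with the class label, I(F; Y) = H(F) + H(Y) - H(F, Y). This is a generic illustration, not the specific criterion of any cited paper.

```python
import numpy as np
from collections import Counter

def entropy(values):
    """Shannon entropy H(X) = -sum p(x) * log2 p(x) over observed values."""
    n = len(values)
    return -sum((c / n) * np.log2(c / n) for c in Counter(values).values())

def mutual_information(feature, labels):
    """I(F; Y) = H(F) + H(Y) - H(F, Y): a standard entropy-based filter score."""
    joint = list(zip(feature, labels))
    return entropy(feature) + entropy(labels) - entropy(joint)

# Rank candidate features by their mutual information with the class label.
y  = [0, 0, 1, 1, 1, 0]
f1 = [0, 0, 1, 1, 1, 0]   # perfectly informative feature
f2 = [1, 0, 1, 0, 1, 0]   # weakly informative feature
print(mutual_information(f1, y), mutual_information(f2, y))
```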
“…We use a forward greedy search algorithm, which is usually more efficient than a standard brute-force exhaustive search [4]. That is, one starts with an empty set of attributes and adds features to the subset of selected attributes one by one.…”
Section: Feature Selection Based on KNRS (mentioning)
confidence: 99%
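A minimal sketch of the forward greedy search described above: start from an empty set and repeatedly add the single feature that most improves an evaluation function (for example, a neighborhood-entropy-based significance measure). The `evaluate` callable is an assumption standing in for whatever subset measure the cited method uses.

```python
def forward_greedy_select(features, evaluate, max_size):
    """Forward greedy search: grow the selected subset one feature at a
    time, always taking the candidate that maximizes `evaluate`, and stop
    when no candidate improves the score or the size limit is reached."""
    selected, remaining = [], list(features)
    best_score = float("-inf")
    while remaining and len(selected) < max_size:
        # Evaluate each candidate added to the current subset.
        gains = [(evaluate(selected + [f]), f) for f in remaining]
        score, best = max(gains, key=lambda g: g[0])
        if score <= best_score:      # no candidate improves the subset
            break
        selected.append(best)
        remaining.remove(best)
        best_score = score
    return selected
```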
“…This theory has been successfully applied to many fields, such as data mining, decision-making, pattern recognition, machine learning, and intelligent control [1][2][3][4]. Kernel rough sets [5] and neighborhood rough sets [6] are two important models in rough set theory.…”
Section: Introduction (mentioning)
confidence: 99%
“…Paper [3] used the Banzhaf power index from game theory to evaluate the weight of each feature, while paper [21] used the Shapley value index for the same purpose. Reference [22] presented a novel feature selection approach called Neighborhood Entropy-Based Cooperative Game Theory (NECGT), built on information-theoretic measures. Evaluation on several UCI data sets showed that the approach yields better accuracy than classical methods.…”
Section: Introduction (mentioning)
confidence: 99%
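The Shapley-value weighting mentioned here can be approximated by Monte Carlo sampling over feature orderings, as in the hedged sketch below; `payoff`, which should return a coalition's classification power, is an assumed callable rather than the cited papers' exact game formulation.

```python
import random

def shapley_weights(features, payoff, num_permutations=200, seed=0):
    """Monte Carlo estimate of each feature's Shapley value: its average
    marginal contribution to `payoff` over random feature orderings."""
    rng = random.Random(seed)
    phi = {f: 0.0 for f in features}
    for _ in range(num_permutations):
        order = features[:]
        rng.shuffle(order)
        coalition, value = [], payoff([])
        for f in order:
            coalition.append(f)
            new_value = payoff(coalition)
            phi[f] += new_value - value   # marginal contribution of f
            value = new_value
    # Average the marginal contributions over all sampled orderings.
    return {f: v / num_permutations for f, v in phi.items()}
```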