2012
DOI: 10.1016/j.eswa.2011.07.048
Feature subset selection wrapper based on mutual information and rough sets

Cited by 80 publications (47 citation statements)
References 25 publications
“…The rough set construct can be used to identify a reduced version of the original set of attributes pertaining to a decision system. As a consequence, rough sets found considerable application in feature selection and classification systems (Thangavel and Pethalakshmi 2009;Swiniarski and Skowron 2003;Foithong et al 2012).…”
Section: Granular Computing as a General Data Analysis Framework
confidence: 99%
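The statement above describes using rough sets to find a reduced attribute set for a decision system. A minimal sketch of that idea, not the cited paper's actual algorithm: compute indiscernibility classes over a candidate attribute subset, measure the dependency degree (the fraction of objects whose class is consistent on the decision), and greedily drop attributes that do not lower it. All names here (`partition`, `dependency`, `greedy_reduct`) and the toy decision table are illustrative assumptions.

```python
from collections import defaultdict

def partition(rows, attrs):
    """Group row indices into indiscernibility classes over the given attributes."""
    blocks = defaultdict(list)
    for i, row in enumerate(rows):
        blocks[tuple(row[a] for a in attrs)].append(i)
    return list(blocks.values())

def dependency(rows, decisions, attrs):
    """Dependency degree gamma(B, D): fraction of rows whose B-class
    is consistent on the decision attribute (the positive region)."""
    pos = 0
    for block in partition(rows, attrs):
        if len({decisions[i] for i in block}) == 1:
            pos += len(block)
    return pos / len(rows)

def greedy_reduct(rows, decisions, attrs):
    """Drop attributes one at a time whenever the dependency degree is preserved."""
    reduct = list(attrs)
    full = dependency(rows, decisions, attrs)
    for a in list(attrs):
        trial = [x for x in reduct if x != a]
        if trial and dependency(rows, decisions, trial) == full:
            reduct = trial
    return reduct

# Toy decision table: attribute 1 duplicates attribute 0; attribute 2 alone
# already determines the decision, so it should survive as the reduct.
rows = [(0, 0, 0), (0, 0, 1), (1, 1, 0), (1, 1, 1)]
decisions = [0, 1, 0, 1]
print(greedy_reduct(rows, decisions, [0, 1, 2]))  # → [2]
```

A greedy pass like this finds one reduct, not all of them; exact reduct enumeration is NP-hard, which is why heuristic searches are the common choice.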
“…The evaluation is independently performed against different features, and the evaluation result, called the "feature rank", is directly used to define the usefulness of each feature for classification. Entropy and mutual information are popular ranking methods for evaluating the relevancy of features [Foithong et al., 2012; Peng et al., 2005; Javed et al., 2012; Estevez et al., 2009]. Zhou et al. [Zhou et al., 2011] used the Rényi entropy for feature relevance evaluation of overall inputs in their automatic scaling SVM.…”
Section: Related Work
confidence: 99%
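Mutual-information ranking as described above can be sketched in a few lines for discrete features: score each feature column by I(feature; label) and sort. This is a generic illustration, not the cited paper's method; the helper names (`mutual_information`, `rank_features`) and the toy data are assumptions.

```python
import math
from collections import Counter

def mutual_information(xs, ys):
    """I(X;Y) in bits for two discrete sequences of equal length."""
    n = len(xs)
    px, py, pxy = Counter(xs), Counter(ys), Counter(zip(xs, ys))
    return sum(c / n * math.log2((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in pxy.items())

def rank_features(columns, labels):
    """Return feature indices sorted by decreasing relevance I(feature; label)."""
    scores = [mutual_information(col, labels) for col in columns]
    order = sorted(range(len(columns)), key=lambda i: -scores[i])
    return order, scores

labels = [0, 0, 1, 1]
columns = [
    [0, 0, 1, 1],   # identical to the labels: I = 1 bit
    [0, 1, 0, 1],   # independent of the labels: I = 0 bits
]
order, scores = rank_features(columns, labels)
print(order)  # → [0, 1]
```

A filter ranking like this scores each feature in isolation, so it cannot see redundancy between features; that limitation is exactly what wrapper and rough-set-based methods try to address.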
“…Due to this problem, feature reduction methods are employed in order to alleviate the storage and time requirements of large feature vectors. There are many feature reduction methods, including linear projection methods such as Principal Component Analysis (PCA) and Linear Discriminant Analysis (LDA), and metric embedding techniques (both linear and non-linear) [4]. However, these methods could not improve the image retrieval performance or efficiently reduce the semantic gap.…”
Section: Introduction
confidence: 99%
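To make the "linear projection" idea mentioned above concrete, here is a minimal PCA sketch for 2-D points: it diagonalizes the 2x2 covariance matrix in closed form and returns the leading principal component, i.e. the direction of maximum variance onto which points would be projected. This is a textbook illustration, not the cited system; the function name `pca_2d` and the sample points are assumptions.

```python
import math

def pca_2d(points):
    """First principal component of 2-D points, via the closed-form
    eigendecomposition of the 2x2 covariance matrix."""
    n = len(points)
    mx = sum(p[0] for p in points) / n
    my = sum(p[1] for p in points) / n
    sxx = sum((p[0] - mx) ** 2 for p in points) / n
    syy = sum((p[1] - my) ** 2 for p in points) / n
    sxy = sum((p[0] - mx) * (p[1] - my) for p in points) / n
    # Largest eigenvalue of [[sxx, sxy], [sxy, syy]] from trace/determinant.
    tr, det = sxx + syy, sxx * syy - sxy * sxy
    lam = tr / 2 + math.sqrt(max(tr * tr / 4 - det, 0.0))
    # Corresponding eigenvector; handle the axis-aligned (sxy = 0) case.
    if abs(sxy) > 1e-12:
        vx, vy = lam - syy, sxy
    else:
        vx, vy = (1.0, 0.0) if sxx >= syy else (0.0, 1.0)
    norm = math.hypot(vx, vy)
    return vx / norm, vy / norm

# Points spread along the diagonal y = x: the component is ~(0.707, 0.707).
pts = [(0, 0), (1, 1), (2, 2), (3, 3)]
print(pca_2d(pts))
```

PCA keeps directions of high variance regardless of class labels, which is one reason (as the quoted passage argues) it need not help retrieval relevance: the discarded low-variance directions may still carry semantic information.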
“…These redundant features could influence further analysis in the wrong direction. Consequently, semantic rules are then extracted from these significant features; the rules can classify the images more accurately and show more relevant images to the user, hence improving the retrieval performance [4]. In addition, in this system relevance feedback is used to bridge the semantic gap.…”
Section: Introduction
confidence: 99%