2002
DOI: 10.1109/tpami.2002.1114861
Input feature selection by mutual information based on Parzen window

Abstract: Mutual information is a good indicator of relevance between variables and has been used as a measure in several feature selection algorithms. However, calculating mutual information is difficult, and the performance of a feature selection algorithm depends on the accuracy of this calculation. In this paper, we propose a new method of calculating mutual information between input and class variables based on the Parzen window, and we apply this to a feature selection algorithm for classification problems…
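The abstract describes the general recipe of plugging class-conditional Parzen density estimates into I(X;C) = H(C) - H(C|X). The following Python sketch illustrates that recipe under stated assumptions (Gaussian kernel, a single rule-of-thumb bandwidth, base-2 logarithms); it is a minimal illustration of the idea, not the paper's exact estimator, and the function name and bandwidth default are assumptions.

```python
import numpy as np

def parzen_mutual_information(X, y, h=None):
    # X: (N, d) continuous features; y: (N,) discrete class labels.
    X = np.asarray(X, dtype=float)
    y = np.asarray(y)
    N, d = X.shape
    classes, counts = np.unique(y, return_counts=True)
    priors = counts / N
    if h is None:
        h = N ** (-1.0 / (d + 4))  # rule-of-thumb bandwidth (an assumption)

    # Class-conditional Parzen densities with a Gaussian kernel, evaluated at
    # every sample (up to a constant factor that cancels in the posterior).
    dens = np.zeros((N, len(classes)))
    for j, c in enumerate(classes):
        Xc = X[y == c]
        sq = ((X[:, None, :] - Xc[None, :, :]) ** 2).sum(axis=-1)
        dens[:, j] = np.exp(-sq / (2.0 * h * h)).mean(axis=1)

    # Posterior p(c | x_n) via Bayes' rule; kernel constants cancel here.
    post = dens * priors
    post /= post.sum(axis=1, keepdims=True) + 1e-300

    # I(X; C) = H(C) - H(C | X), with H(C | X) averaged over the samples.
    H_C = -(priors * np.log2(priors)).sum()
    H_C_given_X = -(post * np.log2(post + 1e-300)).sum(axis=1).mean()
    return H_C - H_C_given_X
```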

Cited by 578 publications (332 citation statements)
References 11 publications
“…The starting values were fixed at h_{1,0} = h_{2,0} = 1/(2 log(N)), as in [11], and the lower and upper bounds at…”
Section: Results (citation type: mentioning)
confidence: 99%
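The starting-value formula in this quote is reconstructed from a garbled extraction; assuming it reads h_{1,0} = h_{2,0} = 1/(2 log N), a bounded two-bandwidth search could be set up as in this sketch. The objective `neg_mi`, the bounds, and all names here are illustrative, since the quote's actual bounds are elided.

```python
import numpy as np
from scipy.optimize import minimize

def initial_bandwidth(N):
    # Starting value h_{1,0} = h_{2,0} = 1 / (2 log N); reconstructed from the
    # quote above and therefore an assumption about the intended formula.
    return 1.0 / (2.0 * np.log(N))

def optimize_bandwidths(neg_mi, N, lower=1e-3, upper=1.0):
    # `neg_mi([h1, h2])` should return the negated MI estimate for a bandwidth
    # pair; the bounds are placeholders, as the quoted text elides the real ones.
    h0 = initial_bandwidth(N)
    result = minimize(neg_mi, x0=np.array([h0, h0]),
                      bounds=[(lower, upper), (lower, upper)], method="L-BFGS-B")
    return result.x
```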
“…Hence, an estimation of the mutual information is required and different methods can be employed. Among the possible methods are histogram-based [25], kernel density estimation [26], k-nearest neighbour [27], Parzen window [28], B-spline [29], adaptive partitioning [30,31] and fuzzy-based [32] approaches. These estimation methods typically involve some pre-set parameters whose optimal values heavily depend on problem characteristics.…”
Section: Related Work (citation type: mentioning)
confidence: 99%
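Of the estimators listed in this quote, the histogram-based one is the simplest to write down. This minimal sketch shows it for two 1-D numeric samples; the bin count is exactly the kind of pre-set parameter the quote warns about, and the function name is illustrative.

```python
import numpy as np

def histogram_mi(x, y, bins=16):
    # Histogram-based estimate of I(X;Y): bin the joint sample, normalize to
    # probabilities, and sum p(x,y) * log2( p(x,y) / (p(x) p(y)) ).
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()                   # joint probabilities
    px = pxy.sum(axis=1, keepdims=True)     # marginal of x
    py = pxy.sum(axis=0, keepdims=True)     # marginal of y
    mask = pxy > 0                          # skip empty cells (0 log 0 = 0)
    return float((pxy[mask] * np.log2(pxy[mask] / (px @ py)[mask])).sum())
```

Re-running this with different `bins` values makes the parameter sensitivity concrete: the estimate can shift appreciably with the choice of partition.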
“…Mutual information, which measures the mutual dependence of two variables, has historically been used successfully in the related area of feature selection, such as in [9], [10], and [11]. Typical feature selection algorithms involving mutual information select features with high mutual information with respect to a classification label or class, such that the features are relevant to the classification task.…”
Section: Mutual Information As Relevance Measure (citation type: mentioning)
confidence: 99%
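As a concrete instance of the selection scheme this quote describes (keep the features with the highest mutual information against the class label), the following hypothetical sketch ranks features by any plug-in MI estimator, such as the ones sketched above.

```python
import numpy as np

def select_top_k(X, y, k, mi_with_label):
    # Score each feature column by its estimated mutual information with the
    # labels, then return the indices of the k highest-scoring features.
    # `mi_with_label(feature_values, labels)` is any MI estimator.
    scores = np.array([mi_with_label(X[:, j], y) for j in range(X.shape[1])])
    return np.argsort(scores)[::-1][:k]
```

Note that plain ranking ignores redundancy among the selected features; greedy schemes such as Battiti's MIFS extend this by penalizing mutual information with already-selected features.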
“…The technique described in [9] for computing the mutual information via approximation of equation 1 is used here. The Parzen windowing technique is used to model the distribution of the continuous-valued feature vectors:…”
Section: Mutual Information As Relevance Measure (citation type: mentioning)
confidence: 99%
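The Parzen-window equation this quote introduces did not survive extraction. As general background, the textbook Gaussian-kernel form is p̂(x) = (1/(N h)) Σ_i K((x - x_i)/h), sketched below; this is a generic form with an illustrative bandwidth, not necessarily the exact kernel or bandwidth used in [9].

```python
import numpy as np

def parzen_density(x, samples, h=0.5):
    # Gaussian-kernel Parzen window estimate of p(x) from a 1-D sample set:
    # average the kernel responses at x and rescale by the bandwidth h.
    samples = np.asarray(samples, dtype=float)
    u = (x - samples) / h
    return (np.exp(-0.5 * u ** 2) / np.sqrt(2.0 * np.pi)).mean() / h
```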