IEEE International Conference on Acoustics Speech and Signal Processing 2002
DOI: 10.1109/icassp.2002.5743865

On feature extraction by mutual information maximization

Abstract: In order to learn discriminative feature transforms, we discuss mutual information between class labels and transformed features as a criterion. Instead of Shannon's definition we use measures based on Renyi entropy, which lends itself to an efficient implementation and an interpretation of "information potentials" and "information forces" induced by samples of data. This paper presents two routes towards practical usability of the method, especially aimed at large databases: The first is an on-line stoch…
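The "information potential" mentioned in the abstract can be made concrete with a small numerical sketch. Below is a minimal NumPy illustration (not the paper's exact implementation) of the quadratic Renyi entropy estimate obtained from a Gaussian Parzen window over transformed samples; the kernel width sigma and the toy data are assumptions for illustration, and the class-conditional combination of such potentials that forms the actual mutual-information criterion is omitted.

import numpy as np

def quadratic_information_potential(y, sigma=1.0):
    """Renyi quadratic entropy estimate H2 = -log V(y), where the
    'information potential' V is the mean of pairwise Gaussian kernels
    evaluated between all transformed samples y_i, y_j."""
    n, d = y.shape
    # Pairwise squared distances between samples.
    diff = y[:, None, :] - y[None, :, :]
    sq_dists = np.sum(diff ** 2, axis=-1)
    # Gaussian kernel with doubled variance (convolution of two Parzen kernels).
    norm = (2.0 * np.pi * (2.0 * sigma ** 2)) ** (-d / 2.0)
    kernel = norm * np.exp(-sq_dists / (4.0 * sigma ** 2))
    v = kernel.mean()                 # information potential
    return v, -np.log(v)              # potential and Renyi quadratic entropy

# Toy example: 100 two-dimensional transformed feature vectors.
rng = np.random.default_rng(0)
y = rng.normal(size=(100, 2))
v, h2 = quadratic_information_potential(y, sigma=0.5)
print(f"information potential V = {v:.4f}, Renyi H2 = {h2:.4f}")

The gradient of V with respect to the samples gives the "information forces" the abstract refers to; an on-line stochastic variant would update the transform from one such force at a time rather than from the full pairwise sum.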

Cited by 78 publications (102 citation statements)
References 4 publications
“…It performed only slightly better than the SIFT method for the NMI evaluation metric. This is in part due to the distinct feature selection criteria bias towards NMI performance, which stems from the removal criteria calibration (Torkkola, 2003). It had the largest performance increase over all other methods for the RMSE CE evaluation metric.…”
Section: Discussion (mentioning, confidence: 99%)
“…The mouse boundaries are extracted using the marching squares algorithm on the intensity-thresholded white light images, once small areas outside the central mouse boundary have been removed (Ho et al., 2005). NMI is a measurement of the mutual information between corresponding images (Wells et al., 1996; Skouson et al., 2001; Pluim et al., 2003; Torkkola, 2003). FSIM evaluates gradient and phase information between corresponding images (Zhang et al., 2011).…”
Section: Methods (mentioning, confidence: 99%)
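As context for the NMI measure referenced in the statement above, normalized mutual information between two aligned images is commonly computed from their joint intensity histogram. The sketch below is a generic illustration, not the cited pipeline; the bin count and the Studholme-style normalization (H(A) + H(B)) / H(A, B) are assumptions, one of several NMI variants used in image registration.

import numpy as np

def normalized_mutual_information(img_a, img_b, bins=64):
    """NMI from the joint intensity histogram of two equally sized images.
    Uses the normalization NMI = (H(A) + H(B)) / H(A, B)."""
    hist_2d, _, _ = np.histogram2d(img_a.ravel(), img_b.ravel(), bins=bins)
    p_ab = hist_2d / hist_2d.sum()           # joint probability
    p_a = p_ab.sum(axis=1)                    # marginal of image A
    p_b = p_ab.sum(axis=0)                    # marginal of image B

    def entropy(p):
        p = p[p > 0]
        return -np.sum(p * np.log(p))

    return (entropy(p_a) + entropy(p_b)) / entropy(p_ab)

# Toy example: NMI of an image against a noisy copy of itself.
rng = np.random.default_rng(1)
a = rng.random((128, 128))
b = a + 0.05 * rng.standard_normal((128, 128))
print(f"NMI = {normalized_mutual_information(a, b):.3f}")

With this normalization, NMI is at least 1 and grows as the two intensity distributions become more predictable from one another, which is why it serves as a similarity score between corresponding images.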
“…Because feature selection occurs outside the bounds of a particular model, quantitative measures need to be put in place to evaluate the set, such as the mutual information measure employed by Torkkola [23] or feature value distances between the target class and near-hit/near-miss examples employed by the Relief algorithm [12]. …”
Section: Related Work (mentioning, confidence: 99%)
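To make the near-hit/near-miss idea in the statement above concrete, here is a minimal sketch (not the cited implementation) of the classic binary-class Relief update: each feature weight is decreased by its distance to the nearest same-class sample ("near-hit") and increased by its distance to the nearest other-class sample ("near-miss"). The iteration count, L1 distance, and toy data are assumptions for illustration.

import numpy as np

def relief_weights(X, y, n_iter=100, rng=None):
    """Basic Relief feature weights for a binary classification problem.
    Features are assumed to be scaled to comparable ranges beforehand."""
    rng = rng or np.random.default_rng(0)
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(n_iter):
        i = rng.integers(n)
        dists = np.abs(X - X[i]).sum(axis=1)   # L1 distance to every sample
        dists[i] = np.inf                       # exclude the sample itself
        same = np.where(y == y[i])[0]
        diff = np.where(y != y[i])[0]
        hit = same[np.argmin(dists[same])]      # nearest same-class neighbor
        miss = diff[np.argmin(dists[diff])]     # nearest other-class neighbor
        # Penalize separation from hits, reward separation from misses.
        w += np.abs(X[i] - X[miss]) - np.abs(X[i] - X[hit])
    return w / n_iter

# Toy example: feature 0 carries the class signal, feature 1 is noise.
rng = np.random.default_rng(2)
y = rng.integers(0, 2, size=200)
X = np.column_stack([y + 0.1 * rng.standard_normal(200),
                     rng.standard_normal(200)])
print(relief_weights(X, y, rng=rng))

A feature that separates the classes ends up with a large positive weight, while an irrelevant feature hovers near zero, which is the sense in which Relief scores features independently of any particular downstream model.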
“…Other sorts of non-parametric methods include [7,11], which compute their error-rate-based criteria from the non-parametrically estimated distribution rather than from an assumed normal distribution, and [4,10], which define the criteria directly from the decision boundary. Here, we do not discuss them in detail, but only focus on the neighbor-based methods.…”
Section: Introduction (mentioning, confidence: 99%)