2005
DOI: 10.1109/lsp.2005.849495
An information-theoretic perspective on feature selection in speaker recognition

Cited by 39 publications (29 citation statements)
References 15 publications
“…The initial difference is made from the entropy definitions, such as Shannon entropy in [12,14,25,26], and Rényi entropy in [6,7,15]. The second difference is the selection of bound relations, such as "P e vs. H(X|Y)" in [12,24], "H(X|Y) vs. P e " in [6,7,14,15,20], "P e vs. MI(X; Y)" in [27,28], and "NMI(X; Y) vs. A" in [25], where A is the accuracy rate, MI(X; Y) and NMI(X; Y) are the mutual information and normalized mutual information between variables X and Y, respectively. Another important study is made on the tightness of bounds.…”
Section: Related Work
confidence: 99%
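As a concrete instance of the "P e vs. H(X|Y)" family of bounds surveyed in this excerpt, Fano's inequality ties the error probability of guessing a label X from an observation Y to the conditional entropy H(X|Y). This is a standard textbook result quoted here for orientation, not a formula taken from the cited papers; logs are base 2 and |X| is the number of classes:

H(X \mid Y) \le H_b(P_e) + P_e \log\bigl(\lvert\mathcal{X}\rvert - 1\bigr),
\qquad H_b(p) = -p\log p - (1-p)\log(1-p),

which rearranges to the lower bound

P_e \ge \frac{H(X \mid Y) - 1}{\log\lvert\mathcal{X}\rvert}.

Since I(X;Y) = H(X) - H(X \mid Y), a larger mutual information lowers this bound, which is the sense in which the "P e vs. MI(X; Y)" and "P e vs. H(X|Y)" relations above are two views of the same trade-off.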
“…Eriksson et al 8 showed that the error rate of a speaker recognition system decreases as mutual information increases. In Eq.…”
Section: I(c;x) = H(c) − H(c|x) (1)
confidence: 99%
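The Eq. (1) referenced above, I(c;x) = H(c) − H(c|x), can be estimated from labelled data with a simple plug-in estimator once the feature is quantized. The Python sketch below is illustrative only; the speaker labels, the quantization into 8 bins, and the toy data are assumptions, not details from the cited work.

import numpy as np

def entropy(counts):
    """Shannon entropy (bits) of a discrete distribution given raw counts."""
    p = counts / counts.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def mutual_information(c, x_q):
    """Plug-in estimate of I(c;x) = H(c) - H(c|x) for discrete labels c
    and a quantized feature x_q (both 1-D integer arrays)."""
    c, x_q = np.asarray(c), np.asarray(x_q)
    h_c = entropy(np.bincount(c))
    # H(c|x) = sum over feature bins v of P(x = v) * H(c | x = v)
    h_c_given_x = 0.0
    for v in np.unique(x_q):
        mask = x_q == v
        h_c_given_x += mask.mean() * entropy(np.bincount(c[mask]))
    return h_c - h_c_given_x

# Toy usage: 4 hypothetical speakers and one feature weakly tied to the speaker,
# quantized into 8 bins via its empirical quantiles.
rng = np.random.default_rng(0)
c = rng.integers(0, 4, size=2000)
x = c + rng.normal(scale=1.5, size=c.size)
x_q = np.digitize(x, np.quantile(x, np.linspace(0, 1, 9)[1:-1]))
print(f"I(c;x) ~ {mutual_information(c, x_q):.3f} bits")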
“…In speaker recognition, some researchers use this measurement to measure or improve speaker recognition accuracy. 8,9 In this paper, we experimentally re-evaluate the speaker discriminative power of each phoneme class using mutual information and then find the optimal phoneme class ratio. We adopt the Nelder-Mead method, which is widely used for nonlinear optimization of multi-dimensional data.…”
Section: Introduction
confidence: 99%
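For the Nelder-Mead step described in this excerpt, SciPy's scipy.optimize.minimize exposes the method directly. The sketch below is a minimal illustration under assumed names: neg_score is a smooth stand-in for the (unspecified) objective that would score a candidate phoneme class ratio, for example via mutual information or recognition accuracy; it is not the cited paper's objective.

import numpy as np
from scipy.optimize import minimize

def neg_score(ratios):
    """Stand-in objective: in the real task this would return the negative
    mutual information (or accuracy) achieved with the given phoneme class ratio."""
    ratios = np.abs(ratios)
    ratios = ratios / ratios.sum()           # normalize to a valid ratio vector
    target = np.array([0.4, 0.3, 0.2, 0.1])  # pretend optimum, illustration only
    return np.sum((ratios - target) ** 2)    # minimize distance = maximize score

x0 = np.full(4, 0.25)  # start from equal ratios for 4 phoneme classes
res = minimize(neg_score, x0, method="Nelder-Mead",
               options={"xatol": 1e-6, "fatol": 1e-6, "maxiter": 2000})
best = np.abs(res.x) / np.abs(res.x).sum()
print("optimal phoneme class ratio ~", np.round(best, 3))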
“…Apparently, according to this formula, a feature subset with a high fitness value is likely to lead to high classification accuracy and a less complex classifier. The corresponding fitness function of evaluation criterion 2 is: fitness 3 …”
Section: Consistency and Otherness of Evaluation Criteria
confidence: 99%
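The exact form of the fitness function is truncated in the excerpt ("fitness 3 …"), so the snippet below only encodes the stated idea, rewarding classification accuracy while penalizing subset size; the weights and the linear form are assumptions, not the cited paper's formula.

def subset_fitness(accuracy, n_selected, n_total, w_acc=0.9, w_size=0.1):
    """Assumed trade-off: high accuracy and a compact (less complex) subset score well."""
    return w_acc * accuracy + w_size * (1.0 - n_selected / n_total)

# Example: 92% accuracy using 12 of 60 features -> fitness 0.908
print(subset_fitness(accuracy=0.92, n_selected=12, n_total=60))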
“…In many pattern classification applications, selection of the most characterizing features (or attributes) of the observed data (such as feature selection or variable selection, among many other names) is important to maximize the classification accuracy [1][2][3][4][5][6][7][8][9]. This is especially important when one is required to deal with a large or even overwhelming feature set.…”
Section: Introduction
confidence: 99%
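A common off-the-shelf way to act on this observation, ranking a large feature set by estimated mutual information with the class label and keeping the top candidates, is sketched below with scikit-learn. The synthetic data and the choice of k = 10 are placeholders; this is not the selection procedure of the cited paper.

import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, mutual_info_classif

# Toy data standing in for a large feature set (the real task would use speech features).
X, y = make_classification(n_samples=500, n_features=60, n_informative=8, random_state=0)

# Rank features by estimated mutual information with the class label, keep the top 10.
selector = SelectKBest(score_func=mutual_info_classif, k=10)
X_sel = selector.fit_transform(X, y)
print("kept feature indices:", np.sort(selector.get_support(indices=True)))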