2012 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
DOI: 10.1109/icassp.2012.6289001

Lower and upper bounds for approximation of the Kullback-Leibler divergence between Gaussian Mixture Models

Abstract: Many speech technology systems rely on Gaussian Mixture Models (GMMs). The need for a comparison between two GMMs arises in applications such as speaker verification, model selection or parameter estimation. For this purpose, the Kullback-Leibler (KL) divergence is often used. However, since there is no closed form expression to compute it, it can only be approximated. We propose lower and upper bounds for the KL divergence, which lead to a new approximation and interesting insights into previously proposed ap…
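For reference, the quantity whose bounds the paper studies is the KL divergence between two Gaussian mixtures; the log of a sum of Gaussians inside the expectation is what rules out a closed-form evaluation. A standard statement of the definition (the component notation below is illustrative, not taken from the paper):

```latex
% KL divergence between two GMMs f and g (standard definition)
D_{\mathrm{KL}}(f \,\|\, g) = \int f(x)\, \log \frac{f(x)}{g(x)} \, dx ,
\qquad
f(x) = \sum_{a=1}^{A} \pi_a \, \mathcal{N}(x;\, \mu_a, \Sigma_a),
\quad
g(x) = \sum_{b=1}^{B} \omega_b \, \mathcal{N}(x;\, \nu_b, \Lambda_b).
```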

Cited by 61 publications (62 citation statements). References 5 publications.
“…While our work formulates the KL divergence with the same HMM definitions as [7], it extends the variational approximation to the case of HMMs whose observation probabilities at the states are modelled by Gaussian mixture models (GMMs). Moreover, our work can be considered an extension of the methods for approximating the KL divergence between GMMs in [8] and [9]. We finally confirm the effectiveness of the proposed methods by experiments.…”
Section: Introduction (supporting)
confidence: 55%
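For context, the variational approximation referred to here (in the line of Hershey and Olsen's approximation for GMMs, which [8] and [9] build on) can be sketched as follows. This is a minimal illustration assuming full-covariance Gaussians passed as NumPy arrays; the helper names `kl_gauss` and `kl_gmm_variational` are ours, not from the cited papers.

```python
# Sketch of the variational approximation of KL(f || g) between two GMMs,
# in the spirit of Hershey & Olsen; parameter layout and names are ours.
import numpy as np

def kl_gauss(mu0, S0, mu1, S1):
    """Closed-form KL divergence between two full-covariance Gaussians."""
    d = mu0.shape[0]
    S1_inv = np.linalg.inv(S1)
    diff = mu1 - mu0
    return 0.5 * (np.trace(S1_inv @ S0) + diff @ S1_inv @ diff - d
                  + np.log(np.linalg.det(S1) / np.linalg.det(S0)))

def kl_gmm_variational(w_f, mu_f, S_f, w_g, mu_g, S_g):
    """D_var(f||g) = sum_a w_f[a] * log( sum_a' w_f[a'] exp(-KL(f_a||f_a'))
                                         / sum_b w_g[b] exp(-KL(f_a||g_b)) )."""
    w_f, w_g = np.asarray(w_f), np.asarray(w_g)
    A, B = len(w_f), len(w_g)
    d_ff = np.array([[kl_gauss(mu_f[a], S_f[a], mu_f[ap], S_f[ap])
                      for ap in range(A)] for a in range(A)])
    d_fg = np.array([[kl_gauss(mu_f[a], S_f[a], mu_g[b], S_g[b])
                      for b in range(B)] for a in range(A)])
    num = np.log(np.exp(-d_ff) @ w_f)   # one entry per component a of f
    den = np.log(np.exp(-d_fg) @ w_g)
    return float(w_f @ (num - den))
```

A useful sanity check of this construction: when both mixtures reduce to a single Gaussian, the expression collapses to the closed-form Gaussian KL divergence, i.e. the approximation is exact in that case.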
“…In [9], an approximation of the KL divergence between GMMs is adopted, based on the idea that strict bounds provide an interval in which the true value of the KL divergence must lie. Motivated by this idea, we also design an approximation for the divergence between HMMs based on bounds.…”
Section: Approximation Based on Bounds for the KL Divergence (mentioning)
confidence: 99%
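To illustrate how strict bounds yield such an interval, one standard construction bounds the two intractable expectations E_{f_a}[log f] and E_{f_a}[log g] from both sides: a variational (Jensen) lower bound and a product-of-Gaussians upper bound for each, combined so that their difference brackets the true divergence. The sketch below implements that construction as an illustration; it is not claimed to reproduce the exact bounds of [9], and all function names are ours.

```python
# Sketch of lower/upper bounds on KL(f || g) between two GMMs, obtained by
# bounding E_{f_a}[log f] and E_{f_a}[log g] with a variational (Jensen)
# lower bound and a product-of-Gaussians upper bound; an illustration of
# the interval idea, not necessarily the exact bounds of [9].
import numpy as np
from scipy.stats import multivariate_normal

def kl_gauss(mu0, S0, mu1, S1):
    """Closed-form KL divergence between two full-covariance Gaussians."""
    d = mu0.shape[0]
    S1_inv = np.linalg.inv(S1)
    diff = mu1 - mu0
    return 0.5 * (np.trace(S1_inv @ S0) + diff @ S1_inv @ diff - d
                  + np.log(np.linalg.det(S1) / np.linalg.det(S0)))

def entropy_gauss(S):
    """Differential entropy of a Gaussian with covariance S."""
    d = S.shape[0]
    return 0.5 * (d * np.log(2.0 * np.pi * np.e) + np.log(np.linalg.det(S)))

def log_overlap(mu0, S0, mu1, S1):
    """log of the product-of-Gaussians normaliser: int N(x;mu0,S0) N(x;mu1,S1) dx."""
    return multivariate_normal.logpdf(mu0, mean=mu1, cov=S0 + S1)

def kl_gmm_bounds(w_f, mu_f, S_f, w_g, mu_g, S_g):
    """Return (lower, upper) with lower <= KL(f || g) <= upper."""
    A, B = len(w_f), len(w_g)
    lower = upper = 0.0
    for a in range(A):
        H_a = entropy_gauss(S_f[a])
        # bounds on E_{f_a}[log f]
        lf_low = np.logaddexp.reduce(
            [np.log(w_f[ap]) - kl_gauss(mu_f[a], S_f[a], mu_f[ap], S_f[ap])
             for ap in range(A)]) - H_a
        lf_up = np.logaddexp.reduce(
            [np.log(w_f[ap]) + log_overlap(mu_f[a], S_f[a], mu_f[ap], S_f[ap])
             for ap in range(A)])
        # bounds on E_{f_a}[log g]
        lg_low = np.logaddexp.reduce(
            [np.log(w_g[b]) - kl_gauss(mu_f[a], S_f[a], mu_g[b], S_g[b])
             for b in range(B)]) - H_a
        lg_up = np.logaddexp.reduce(
            [np.log(w_g[b]) + log_overlap(mu_f[a], S_f[a], mu_g[b], S_g[b])
             for b in range(B)])
        lower += w_f[a] * (lf_low - lg_up)
        upper += w_f[a] * (lf_up - lg_low)
    return lower, upper
```

Any consistent estimate of KL(f||g), e.g. a Monte Carlo one, should fall between the two returned values up to sampling noise, which also makes the pair a convenient sanity check for an implementation.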
“…Similarly, in the case of the Kullback-Leibler divergence, an analytic evaluation of the differential entropy is also impossible. Thus, approximate calculations become inevitable [7][8][9]. Jenssen et al. [3] use the Rényi entropy [10] as a similarity measure between clusters.…”
Section: Introduction (mentioning)
confidence: 99%
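Because neither the differential entropy of a GMM nor the cross term has a closed form, a Monte Carlo estimate is the usual brute-force reference against which such approximations are judged. A minimal sketch, assuming each GMM is passed as a (weights, means, covariances) tuple (a representation chosen here for convenience, not prescribed by any of the cited works):

```python
# Sketch of a Monte Carlo estimate of KL(f || g) between two GMMs; the GMM
# representation as a (weights, means, covariances) tuple is illustrative.
import numpy as np
from scipy.stats import multivariate_normal

def gmm_logpdf(x, weights, means, covs):
    """Log-density of the GMM evaluated at the rows of x."""
    comp = np.stack([np.log(w) + multivariate_normal.logpdf(x, mean=m, cov=c)
                     for w, m, c in zip(weights, means, covs)])
    return np.logaddexp.reduce(comp, axis=0)

def gmm_sample(n, weights, means, covs, rng):
    """Draw n samples from the GMM."""
    idx = rng.choice(len(weights), size=n, p=weights)
    return np.stack([rng.multivariate_normal(means[k], covs[k]) for k in idx])

def kl_gmm_monte_carlo(f, g, n=10_000, seed=0):
    """KL(f || g) ~= (1/n) sum_i [log f(x_i) - log g(x_i)], with x_i drawn from f."""
    rng = np.random.default_rng(seed)
    x = gmm_sample(n, *f, rng)
    return float(np.mean(gmm_logpdf(x, *f) - gmm_logpdf(x, *g)))

# Example with hypothetical parameters:
# f = (np.array([0.6, 0.4]), [np.zeros(2), np.ones(2)], [np.eye(2)] * 2)
# g = (np.array([1.0]), [np.zeros(2)], [2.0 * np.eye(2)])
# print(kl_gmm_monte_carlo(f, g))
```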
“…The dissimilarity arises in the condition in (47) for Lemma 5, which is different from the condition in (41) for Lemma 4. This changes the range for the minimax risk ε* in which the lower bound in (45) holds.…”
Section: A. Sparse Gaussian Coefficients (mentioning)
confidence: 99%