2006
DOI: 10.1007/11677482_27
Variational Bayesian Methods for Audio Indexing

Cited by 22 publications (32 citation statements)
References 5 publications
“…It enables an inference problem to be converted to an optimisation problem by approximating the intractable distribution with a tractable approximation obtained by minimising the Kullback-Leibler divergence between them. In [23] a Variational Bayes-EM algorithm is used to learn a GMM speaker model and optimize a change detection process and the merging criterion. In [24] Variational Bayes is combined successfully with eigenvoice modeling, described in [25], for the speaker diarization of telephone conversations.…”
Section: Bottom-up Approach (mentioning, confidence: 99%)
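The statement above describes Variational Bayes as replacing an intractable distribution with a tractable approximation by minimising the Kullback-Leibler divergence between them. As a minimal, hypothetical illustration (not code from any of the cited papers), the KL divergence between two univariate Gaussians has a closed form that makes the minimisation target concrete:

```python
import math

def kl_gaussians(mu1, var1, mu2, var2):
    """Closed-form KL( N(mu1, var1) || N(mu2, var2) ) for univariate Gaussians."""
    return 0.5 * (math.log(var2 / var1) + (var1 + (mu1 - mu2) ** 2) / var2 - 1.0)

# The divergence is zero iff the two distributions coincide,
# and strictly positive (and asymmetric) otherwise.
print(kl_gaussians(0.0, 1.0, 0.0, 1.0))  # 0.0
print(kl_gaussians(0.0, 1.0, 1.0, 2.0))
```

In VB the same idea is applied between a factorised approximate posterior and the true posterior; the univariate case above is only the simplest instance of the quantity being minimised.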
“…There exists a large amount of previous work on the diarization problem, much of which is reviewed in [1]- [3]. Because of its relative simplicity, the Bayesian Information Criterion (BIC) has served as a backbone and an inspiration for the development of a number of initial approaches involving speaker change detection and bottom-up hierarchical clustering [4], [5]. Bottom-up approaches in general, where a number of clusters or models are trained and successively merged until only one remains for each speaker, are easily the most popular in the community and consistently tend to achieve the state-of-the-art [6], [7].…”
Section: Introduction (mentioning, confidence: 99%)
“…marginal likelihoods, posterior probabilities, predictive densities) by bounding the marginal likelihood of the model from below. The use of VB in SD was pioneered by F. Valente (Valente, 2005) and refined by P. Kenny et al. (Kenny et al, 2010) by applying it to i-vectors. We should emphasize that VB is a general-purpose (approximate) inference method and its use is not limited to finite mixture models.…”
Section: Methods Based On Variational Bayes Approximate Inference (mentioning, confidence: 99%)
“…Each supervector is then projected onto a space of lower dimensionality and VB inference is adopted to estimate the number of speakers and the assignment of segments to speakers. VB methods that do not make use of the supervector representation can be found in (Valente, 2005),…”
Section: A Variational Bayes Approach To Speaker Diarization Using Su… (mentioning, confidence: 99%)