2015
DOI: 10.1109/lsp.2015.2459059

Quasi-Factorial Prior for i-vector Extraction

Abstract: We analyze i-vector extraction from the perspective of the prior distribution exerted on the mean supervector of the Gaussian mixture model (GMM). To this end, we start off with an analysis of the subspace prior, which leads to the compressed representation in standard i-vector extraction. We then propose the use of a quasi-factorial prior and show how it impacts the total variability space and its application to i-vector extraction. The quasi-factorial prior could be used in a standalone manner, or in comb…
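
To make the subspace prior concrete, here is a minimal sketch of the standard total variability model the abstract refers to (the textbook formulation; notation is assumed, not taken verbatim from the paper). The utterance-dependent GMM mean supervector \mathbf{M} is constrained to

\[
\mathbf{M} = \mathbf{m} + \mathbf{T}\mathbf{w}, \qquad \mathbf{w} \sim \mathcal{N}(\mathbf{0}, \mathbf{I}),
\]

where \mathbf{m} is the universal background model (UBM) mean supervector and \mathbf{T} is the low-rank total variability matrix. Equivalently, this places a degenerate Gaussian prior \mathbf{M} \sim \mathcal{N}(\mathbf{m}, \mathbf{T}\mathbf{T}^{\mathsf{T}}) on the supervector, and the i-vector is the posterior mean of \mathbf{w} given the utterance's Baum-Welch statistics. The proposed quasi-factorial prior modifies this prior on \mathbf{M}; since the abstract above is truncated, its exact form is not reproduced here.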

Cited by 3 publications (1 citation statement)
References 17 publications
“…Supervectors can be regarded as representations of GMMs that differ only in their mixture means [15], and since the total variability model may provide only an incomplete representation for short-duration utterances, direct modelling of the supervectors may be beneficial. Parameter tying across mixtures in the total variability model is relaxed in [16,17], and banks of local variability vectors or concatenated local vectors are obtained. G-PLDA was then trained on top of them.…”
Section: Introduction
confidence: 99%
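
A sketch of the parameter-tying relaxation this citation statement refers to (the notation here is assumed for illustration, not drawn from [16,17]): writing the total variability matrix in per-mixture blocks \mathbf{T}_c, the standard model ties a single latent vector across all C mixtures,

\[
\boldsymbol{\mu}_c = \mathbf{m}_c + \mathbf{T}_c \mathbf{w}, \qquad c = 1, \dots, C,
\]

whereas relaxing the tying gives each mixture its own local variability vector,

\[
\boldsymbol{\mu}_c = \mathbf{m}_c + \mathbf{T}_c \mathbf{w}_c, \qquad \mathbf{w}_c \sim \mathcal{N}(\mathbf{0}, \mathbf{I}),
\]

and either the bank of the \mathbf{w}_c or their concatenation [\mathbf{w}_1^{\mathsf{T}}, \dots, \mathbf{w}_C^{\mathsf{T}}]^{\mathsf{T}} serves as the utterance representation on which G-PLDA is then trained.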