Deep Normalization for Speaker Vectors
Preprint, 2020
DOI: 10.48550/arxiv.2004.04095

Cited by 7 publications (15 citation statements) | References 51 publications
“…For LDA, the best performance is obtained when the dimension is reduced to 200. For subspace DNF, we can only reduce the dimension to 400, and most of the performance improvement is not due to dimension reduction but to the nonlinear normalization provided by the full-dimension DNF [13]. Nonetheless, the results show that a nonlinear dimension reduction can be achieved by subspace DNF, and it may lead to additional performance gain over nonlinear normalization.…”
Section: Data and Setting
confidence: 94%
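The LDA baseline quoted above is straightforward to reproduce in outline. The sketch below is a minimal illustration with scikit-learn, assuming hypothetical 512-dimensional speaker vectors and 300 speakers; the data, shapes, and variable names are illustrative, not the cited authors' pipeline.

```python
# Minimal sketch of the LDA dimension-reduction baseline described in
# the quote above. The speaker vectors and labels here are random
# placeholders, not real embeddings.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
X = rng.standard_normal((5000, 512))   # hypothetical 512-dim speaker vectors
y = rng.integers(0, 300, size=5000)    # hypothetical speaker labels

# LDA can project to at most (n_classes - 1) dimensions; here we reduce
# to the 200 dimensions reported as best for the LDA baseline.
lda = LinearDiscriminantAnalysis(n_components=200)
X_lda = lda.fit_transform(X, y)
print(X_lda.shape)  # (5000, 200)
```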
“…To solve this problem, we extended NF to DNF, a discriminative normalization flow [13]. It is simply an NF, but each class owns its individual class mean:…”
Section: DNF: Real Deep LDA
confidence: 99%
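The quote is truncated where the citing paper states the DNF likelihood. As a hedged sketch, the standard flow change-of-variables formula with a per-class latent mean would read as below, where f is the flow and μ_s is the mean owned by class s; the identity latent covariance is the usual NF convention. This is a reconstruction from the surrounding description, not the quoted paper's verbatim equation.

```latex
% Sketch of a class-conditional NF likelihood with per-class mean mu_s
% (identity latent covariance assumed); a reconstruction, not verbatim.
\[
  \log p(\mathbf{x} \mid s)
    = \log \mathcal{N}\!\bigl(f(\mathbf{x});\, \boldsymbol{\mu}_s,\, \mathbf{I}\bigr)
    + \log \left| \det \frac{\partial f(\mathbf{x})}{\partial \mathbf{x}} \right|
\]
```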
“…This is not a good property for downstream applications that require discrimination within the latent space, e.g., pronunciation proficiency. Recently, the authors proposed a discriminative NF (DNF) model to deal with this problem [26]. The main advantage of DNF is that it allows each class to have its own Gaussian prior, i.e.…”
Section: Discriminative NF
confidence: 99%
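To make the per-class Gaussian prior concrete, here is a minimal PyTorch sketch of a DNF-style negative log-likelihood: a flow maps x to z together with a log-determinant term, and the prior mean is selected by the class label. The `flow` callable, the identity stand-in, and all dimensions are assumptions for illustration, not the cited implementation.

```python
# Minimal sketch of a DNF-style loss: Gaussian prior N(z; mu_y, I)
# with a class-dependent mean, plus the flow's log-determinant.
import torch

def dnf_nll(flow, class_means, x, labels):
    """Negative log-likelihood under a per-class latent Gaussian prior."""
    z, log_det = flow(x)                   # z: (B, D), log_det: (B,)
    mu = class_means[labels]               # one latent mean per class label
    log_prior = -0.5 * ((z - mu) ** 2).sum(dim=1)  # up to an additive constant
    return -(log_prior + log_det).mean()

# Usage with the identity map as a trivial stand-in flow (log-det = 0):
D, C = 4, 3
class_means = torch.nn.Parameter(torch.zeros(C, D))
flow = lambda x: (x, torch.zeros(x.shape[0]))
x = torch.randn(8, D)
labels = torch.randint(0, C, (8,))
print(dnf_nll(flow, class_means, x, labels).item())
```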