2000
DOI: 10.1109/89.824696

Maximum likelihood and minimum classification error factor analysis for automatic speech recognition

Abstract: Hidden Markov models (HMM's) for automatic speech recognition rely on high-dimensional feature vectors to summarize the short-time properties of speech. Correlations between features can arise when the speech signal is nonstationary or corrupted by noise. We investigate how to model these correlations using factor analysis, a statistical method for dimensionality reduction. Factor analysis uses a small number of parameters to model the covariance structure of high-dimensional data. These parameters ca…
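The parameter saving described in the abstract can be made concrete with a small sketch. Assuming the standard factor-analysis parameterization of a covariance matrix, Σ = ΛΛᵀ + Ψ, with a d×k loading matrix Λ (k ≪ d) and a diagonal Ψ (the function and dimension names below are illustrative, not taken from the paper):

```python
import numpy as np

def fa_covariance(Lam, psi_diag):
    """Assemble the factor-analysis covariance  Sigma = Lam @ Lam.T + diag(psi_diag).

    Lam      : (d, k) loading matrix with k << d
    psi_diag : (d,)   diagonal noise variances
    """
    return Lam @ Lam.T + np.diag(psi_diag)

d, k = 39, 5                        # e.g. a 39-dim acoustic feature vector, 5 factors
rng = np.random.default_rng(0)
Lam = 0.1 * rng.standard_normal((d, k))
psi = np.abs(rng.standard_normal(d)) + 0.1
Sigma = fa_covariance(Lam, psi)     # d*k + d = 234 parameters vs. d*(d+1)/2 = 780 for a full covariance
```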

Cited by 52 publications (30 citation statements) | References 23 publications
“…From the definition of the factor analysis model (1) and of the distributions of the factor and the error vectors, we derive that conditional on the factors y_j, the observations x_j are independently distributed as t(µ + Λy_j, Ψ, ν) (6), or, alternatively,…”
Section: A. Mixtures of Student's-t Factor Analyzers
confidence: 99%
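The statement above specifies the conditional law x_j | y_j ~ t(µ + Λy_j, Ψ, ν) of a Student's-t factor analyzer. As a minimal sketch of that conditional, assuming a diagonal Ψ and using the standard gamma scale-mixture construction of the multivariate t (the function name is ours, not the cited paper's):

```python
import numpy as np

def sample_t_fa_conditional(mu, Lam, psi_diag, nu, y, rng=None):
    """Draw x | y ~ t(mu + Lam @ y, diag(psi_diag), nu): a Student's-t
    factor analyzer conditional, sampled via the gamma scale-mixture
    representation of the multivariate t distribution."""
    rng = np.random.default_rng() if rng is None else rng
    loc = mu + Lam @ y                              # conditional location mu + Lambda*y
    u = rng.gamma(shape=nu / 2.0, scale=2.0 / nu)   # u ~ Gamma(nu/2, rate nu/2)
    # Given u, x is Gaussian with covariance diag(psi)/u; marginally x is t with nu dof.
    return loc + rng.normal(size=mu.shape) * np.sqrt(psi_diag / u)

d, k, nu = 39, 5, 4.0
rng = np.random.default_rng(1)
x = sample_t_fa_conditional(np.zeros(d), 0.1 * rng.standard_normal((d, k)),
                            np.ones(d), nu, rng.standard_normal(k), rng)
```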
“…The MFA model has been applied to a large variety of signal modeling and classification problems. Among its most typical applications, we mention speech recognition [6], text-independent speaker recognition [7], [8], and face detection [9]. Nevertheless, the MFA model suffers from a significant shortcoming, common to every GMM-based or GMM-related model: the model parameter estimation procedure can be adversely affected by outliers in the training data [3].…”
Section: Introduction
confidence: 99%
“…The above expression is based on factor analysis which allows each of the Gaussian components to be estimated separately using EM [151]. Factor-analysed HMMs [146] generalise this to support both tying over multiple components of the loading matrices and using GMMs to model the latent variable-space of z in Figure 3.3.…”
Section: Structured Covariance Matrices
confidence: 99%
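The tying mentioned here, in which a loading matrix is shared across Gaussian components, can be sketched as follows. This is a simplified illustration under our own assumptions (one loading matrix tied across all components, per-component diagonal noise), not the exact factor-analysed HMM of [146]:

```python
import numpy as np

def tied_fa_covariances(Lam_shared, psi_diags):
    """Per-component covariances  Sigma_m = Lam @ Lam.T + diag(psi_m),
    with one loading matrix Lam tied across all M components.

    Lam_shared : (d, k)  shared loading matrix
    psi_diags  : (M, d)  per-component diagonal noise variances
    """
    base = Lam_shared @ Lam_shared.T
    return np.stack([base + np.diag(p) for p in psi_diags])

d, k, M = 39, 5, 8
rng = np.random.default_rng(2)
Sigmas = tied_fa_covariances(0.1 * rng.standard_normal((d, k)),
                             np.abs(rng.standard_normal((M, d))) + 0.1)
print(Sigmas.shape)   # (8, 39, 39), from only d*k + M*d free covariance parameters
```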
“…It can be used as a structured covariance matrix scheme, as an alternative to using full covariance matrices [14], [15], [17]. The generative model for the shared FA [14] extension of FA can be written as…”
Section: B. Relationship to Factor Analysis
confidence: 99%
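The generative-model equation itself is truncated in the snippet above. For orientation only, the standard (unshared) factor-analysis generative model takes the form below; the sharing structure of [14] is not reproduced here.

```latex
z \sim \mathcal{N}(0, I), \qquad
\epsilon \sim \mathcal{N}(0, \Psi), \qquad
x = \mu + \Lambda z + \epsilon
\;\;\Longrightarrow\;\;
x \sim \mathcal{N}\!\left(\mu,\; \Lambda \Lambda^{\top} + \Psi\right).
```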