Published: 2004
DOI: 10.1016/j.patrec.2004.03.019

Almost autonomous training of mixtures of principal component analyzers

Cited by 6 publications (4 citation statements)
References 9 publications
“…Exhaustive search over all possible models is inefficient due to the large number of candidate models to be considered. A fast sub-optimal algorithm with Bayesian information criterion (BIC) [14,26] was proposed to address the model selection problem.…”
Section: Proposition
confidence: 99%
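For context, BIC scores a candidate model as -2 ln L + p ln n, where L is the maximized likelihood, p the number of free parameters, and n the sample size; the candidate with the lowest score is kept, which avoids exhaustive enumeration when candidates can be ranked cheaply. The sketch below is only illustrative and is not the cited algorithm: it uses scikit-learn's GaussianMixture as a stand-in for a mixture of principal component analyzers and compares candidate component counts by BIC; the toy data and candidate range are assumptions.

```python
# Hedged sketch: rank candidate mixture sizes by BIC instead of exhaustive search.
# GaussianMixture is used only as an illustrative stand-in for a mixture of PCAs.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
X = np.vstack([
    rng.normal(loc=0.0, scale=1.0, size=(200, 5)),
    rng.normal(loc=4.0, scale=0.5, size=(200, 5)),
])

candidates = range(1, 6)          # candidate numbers of mixture components
bic_scores = {}
for k in candidates:
    gm = GaussianMixture(n_components=k, covariance_type="full",
                         random_state=0).fit(X)
    # BIC = -2 * log-likelihood + (number of free parameters) * log(n)
    bic_scores[k] = gm.bic(X)

best_k = min(bic_scores, key=bic_scores.get)   # lower BIC is better
print(bic_scores, "-> selected k =", best_k)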
“…According to the criterion of variance ratio, the retained PCs should explain in excess of, say, 90% of the total variance in the data (Musa et al., 2004).…”
Section: Model Selection
confidence: 99%
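As a hedged illustration of that criterion (not the authors' code), the snippet below fits a PCA and keeps the smallest number of components whose cumulative explained-variance ratio reaches 90%; the 90% threshold and the toy data are assumptions for the example.

```python
# Hedged sketch of the variance-ratio criterion: retain the smallest number of
# principal components whose cumulative explained variance reaches the threshold.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
X = rng.normal(size=(300, 10)) @ rng.normal(size=(10, 10))  # correlated toy data

pca = PCA().fit(X)
cumulative = np.cumsum(pca.explained_variance_ratio_)
q = int(np.searchsorted(cumulative, 0.90) + 1)   # smallest q with >= 90% variance
print("retain q =", q, "components, explaining", cumulative[q - 1])
```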
“…In earlier work, a fast model selection algorithm was proposed with the restrictive assumption that all mixtures have the same number of PCs (Q_k). To allow different Q_k's while attaining computational efficiency, Musa et al. (2004) proposed to apply the criterion of variance ratio for the selection of PCs within each mixture, and developed a greedy training algorithm to determine the number of mixtures. Despite the fact that this two-stage algorithm is sub-optimal, it was shown (Musa et al., 2004) to achieve promising performance in practice.…”
Section: Model Selection
confidence: 99%
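The sketch below is only a rough analogue of the two-stage idea described above, not Musa et al.'s algorithm: hard k-means partitions stand in for the mixture responsibilities, each component retains PCs by the 90% variance-ratio rule, and the number of components grows greedily while held-out reconstruction error keeps improving. The function name, thresholds, and toy data are assumptions for illustration.

```python
# Hedged two-stage sketch: per-component PC selection by variance ratio (stage 1),
# greedy growth of the number of components (stage 2). Not the cited algorithm.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.decomposition import PCA

def mixture_reconstruction_error(X_train, X_val, m, threshold=0.90, seed=0):
    """Fit m hard components (k-means stand-in), keep PCs per component by the
    variance-ratio rule, and return mean squared reconstruction error on X_val."""
    km = KMeans(n_clusters=m, n_init=10, random_state=seed).fit(X_train)
    local_pcas = []
    for k in range(m):
        Xk = X_train[km.labels_ == k]
        full = PCA().fit(Xk)
        q = int(np.searchsorted(np.cumsum(full.explained_variance_ratio_),
                                threshold) + 1)
        local_pcas.append(PCA(n_components=q).fit(Xk))
    # assign each held-out point to its nearest component, then measure how well
    # that component's retained PCs reconstruct it
    labels = km.predict(X_val)
    err = 0.0
    for k, pca in enumerate(local_pcas):
        Xk = X_val[labels == k]
        if len(Xk):
            err += np.sum((Xk - pca.inverse_transform(pca.transform(Xk))) ** 2)
    return err / len(X_val)

rng = np.random.default_rng(2)
X = np.vstack([rng.normal(loc=c, scale=1.0, size=(300, 8)) for c in (0.0, 5.0, 10.0)])
rng.shuffle(X)
X_train, X_val = X[:600], X[600:]

best_err, best_m = np.inf, 1
for m in range(1, 9):              # greedy growth of the number of components
    err = mixture_reconstruction_error(X_train, X_val, m)
    if err >= best_err:            # stop once an extra component no longer helps
        break
    best_err, best_m = err, m
print("selected", best_m, "components; validation error", best_err)
```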