Learning in Graphical Models 1998
DOI: 10.1007/978-94-011-5014-9_12
A View of the EM Algorithm that Justifies Incremental, Sparse, and Other Variants

Cited by 1,535 publications (1,404 citation statements). References 3 publications.
“…Initially (Hinton and van Camp, 1993; MacKay, 1995a,b) the free energy was described in terms of description lengths and coding. Later, established methods like EM were considered in the light of variational free energy (Neal and Hinton, 1998; see also Bishop, 1999). Variational learning can be regarded as subsuming most other learning schemes as special cases.…”
Section: The Variational Approach (mentioning)
Confidence: 99%
“…In order for this convergence property to apply, it is necessary that the bound h(θ|θ_k) touches the objective function. Previous work has, however, shown that ‘approximate’ EM can be effective in practice despite lacking an analytic convergence guarantee [14], [19]. In the following sections we show that exact EM is intractable in the hybrid model learning problem, but propose a tractable, approximate EM approach that is effective in practice.…”
Section: Review of Expectation-Maximization (mentioning)
Confidence: 84%
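The convergence property this excerpt refers to is the standard minorization argument for exact EM; a sketch, using ℓ for the log-likelihood and h for the bound as in the excerpt:

```latex
% The bound lies below the objective everywhere and touches it
% at the current iterate \theta_k (the "touching" condition):
h(\theta \mid \theta_k) \le \ell(\theta) \quad \forall \theta,
\qquad h(\theta_k \mid \theta_k) = \ell(\theta_k).

% Maximizing the bound in the M-step then guarantees monotone ascent:
\ell(\theta_{k+1}) \ge h(\theta_{k+1} \mid \theta_k)
                   \ge h(\theta_k \mid \theta_k) = \ell(\theta_k).
```

When the bound only approximately touches the objective, the final equality fails, which is why approximate EM lacks this analytic guarantee.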
“…The E-step can be viewed as a search over the space of distributions q of the latent variables W, keeping the parameters Θ fixed (3.7), and the M-step can be interpreted as a search over the parameter space, keeping the latent-variable distribution q fixed (3.8). The cost function for the EM is given by [15]:…”
Section: Inference for Probabilistic PCA via Variational EM (mentioning)
Confidence: 99%
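The coordinate-ascent view quoted above (E-step: optimize over q with Θ fixed; M-step: optimize over Θ with q fixed) can be sketched on a two-component 1-D Gaussian mixture. This is a minimal illustration, not the cited papers' method; all names here are illustrative:

```python
import math

def em_gmm_1d(x, iters=50):
    """EM for a two-component 1-D Gaussian mixture.

    E-step: with parameters fixed, set q(w_i) to the exact posterior
    responsibilities, the optimal distribution over latent labels.
    M-step: with q fixed, maximize the expected complete-data
    log-likelihood over the parameters in closed form.
    """
    # Crude initialization (an illustrative choice).
    mu = [min(x), max(x)]
    var = [1.0, 1.0]
    pi = [0.5, 0.5]
    lls = []  # log-likelihood per iteration; non-decreasing for exact EM
    for _ in range(iters):
        # E-step: responsibilities r[i][k] = q(w_i = k | x_i, current params)
        r = []
        ll = 0.0
        for xi in x:
            p = [pi[k]
                 * math.exp(-(xi - mu[k]) ** 2 / (2 * var[k]))
                 / math.sqrt(2 * math.pi * var[k])
                 for k in range(2)]
            s = sum(p)
            ll += math.log(s)
            r.append([pk / s for pk in p])
        lls.append(ll)
        # M-step: closed-form updates of (pi, mu, var) given q
        for k in range(2):
            nk = sum(ri[k] for ri in r)
            pi[k] = nk / len(x)
            mu[k] = sum(ri[k] * xi for ri, xi in zip(r, x)) / nk
            var[k] = sum(ri[k] * (xi - mu[k]) ** 2 for ri, xi in zip(r, x)) / nk
            var[k] = max(var[k], 1e-6)  # guard against variance collapse
    return mu, lls
```

Because each E-step picks the exact posterior, the bound touches the log-likelihood at the current parameters and each iteration cannot decrease it, which is the convergence property the excerpts discuss.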