1994
DOI: 10.1109/78.324732

Space-alternating generalized expectation-maximization algorithm

Abstract: The expectation-maximization (EM) method can facilitate maximizing likelihood functions that arise in statistical estimation problems. In the classical EM paradigm, one iteratively maximizes the conditional log-likelihood of a single unobservable complete data space, rather than maximizing the intractable likelihood function for the measured or incomplete data. EM algorithms update all parameters simultaneously, which has two drawbacks: 1) slow convergence, and 2) difficult maximization steps due to coupling w…

Cited by 918 publications (552 citation statements)
References 26 publications
“…One solution to this problem could be expectation-conditional maximization (ECM) (Meng and Rubin, 1993; Meng, 1994), which replaces the M-step by a series of computationally simplified conditional maximization (CM) steps. ECM belongs to the class of generalized EM (GEM) algorithms, in which the Q-function is increased rather than maximized (Fessler and Hero, 1994). For systems with a high number of states, the ensemble Kalman smoother (EnKS) (Evensen, 2003, 2009b) provides an alternative to the RTS smoother used in this work.…”
Section: Estimation Algorithm
confidence: 99%
“…So-called generalized EM algorithms, which increase but do not necessarily maximally increase the likelihood at each iteration, have also been proposed (e.g., [FH94]). The EM recursion is sometimes regarded as having two separate steps: an "E-step", which computes the conditional expectation (i.e., the integral) in (6.91), and an "M-step", which performs the maximization in (6.91).…”
Section: Iterative Noncoherent Equalization Via the EM Algorithm
confidence: 99%
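The E-step/M-step recursion described in the statement above can be illustrated with a toy example. The following is a minimal sketch (not the cited paper's algorithm) of classical EM for a two-component Gaussian mixture with known unit variances; the data, initial values, and function name are hypothetical choices for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical synthetic data: equal-weight mixture of N(-2, 1) and N(3, 1)
x = np.concatenate([rng.normal(-2, 1, 200), rng.normal(3, 1, 200)])

def em_gmm(x, n_iter=50):
    """Classical EM for a two-component unit-variance Gaussian mixture:
    all parameters are updated simultaneously at each iteration."""
    mu = np.array([-1.0, 1.0])   # initial component means (assumed)
    pi = np.array([0.5, 0.5])    # initial mixing weights
    for _ in range(n_iter):
        # E-step: posterior responsibility of each component for each point
        lik = pi * np.exp(-0.5 * (x[:, None] - mu) ** 2) / np.sqrt(2 * np.pi)
        r = lik / lik.sum(axis=1, keepdims=True)
        # M-step: maximize the expected complete-data log-likelihood
        mu = (r * x[:, None]).sum(axis=0) / r.sum(axis=0)
        pi = r.mean(axis=0)
    return mu, pi

mu, pi = em_gmm(x)
print(np.sort(mu))  # means should land near -2 and 3
```

Here the "complete data" are the unobserved component assignments; the E-step computes their conditional expectation (the responsibilities) and the M-step maximizes the resulting surrogate in closed form.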
“…One of the most popular approaches to obtaining more efficient ML estimates is the EM algorithm [2]. To further improve the speed of convergence of the EM approach, the SAGE algorithm has been proposed [3]. In SAGE, parameters are updated sequentially in lower-dimensional parameter spaces.…”
Section: Maximum-Likelihood Based Parameter Estimation
confidence: 99%
“…Various forms of the EM algorithm have been developed to further improve its performance. The most popular is the space-alternating generalized EM (SAGE) algorithm, which was developed by Fessler and Hero [3]. In SAGE, parameters are updated sequentially, in contrast with EM, where all parameters are updated simultaneously.…”
Section: Introduction
confidence: 99%
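The sequential-update idea described in the statements above can be sketched on the same toy mixture model. This is a simplified, SAGE-flavored illustration, not the exact hidden-data-space construction of Fessler and Hero: each component mean is updated one at a time, with the E-step refreshed using the latest parameter values, rather than updating all means in a single joint M-step. All names and starting values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical synthetic data: equal-weight mixture of N(-2, 1) and N(3, 1)
x = np.concatenate([rng.normal(-2, 1, 200), rng.normal(3, 1, 200)])

def sage_gmm(x, n_iter=50):
    """SAGE-style sequential updates for a two-component unit-variance
    Gaussian mixture: one component mean is updated at a time, and the
    responsibilities are recomputed with the most recent parameters."""
    mu = np.array([-1.0, 1.0])   # initial component means (assumed)
    pi = np.array([0.5, 0.5])    # initial mixing weights
    for _ in range(n_iter):
        for k in range(2):
            # E-step with the *latest* parameter values
            lik = pi * np.exp(-0.5 * (x[:, None] - mu) ** 2) / np.sqrt(2 * np.pi)
            r = lik / lik.sum(axis=1, keepdims=True)
            # Update only the parameter indexed by k (lower-dimensional step)
            mu[k] = (r[:, k] * x).sum() / r[:, k].sum()
        pi = r.mean(axis=0)
    return mu, pi

mu, pi = sage_gmm(x)
print(np.sort(mu))  # means should land near -2 and 3
```

Because each small update immediately informs the next E-step, this style of cycling through parameter subsets is what the citing papers credit with faster convergence than simultaneous EM updates.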