1999
DOI: 10.1111/1467-9868.00176
Maximizing Generalized Linear Mixed Model Likelihoods With an Automated Monte Carlo EM Algorithm

Abstract: Two new implementations of the EM algorithm are proposed for maximum likelihood fitting of generalized linear mixed models. Both methods use random (independent and identically distributed) sampling to construct Monte Carlo approximations at the E-step. One approach involves generating random samples from the exact conditional distribution of the random effects (given the data) by rejection sampling, using the marginal distribution as a candidate. The second method uses a multivariate t importance sampling appr…

Cited by 473 publications (421 citation statements)
References 55 publications
“…For the MCEM method for the joint covariate and response model, we start with k_0 = 100 Monte Carlo samples and increase the Monte Carlo sample size [23] as the iteration number t increases: k_{t+1} = k_t + k_t/c with c = 4. Convergence of the EM algorithm was considered achieved when the maximum percentage change across all estimates was less than 0.1% over 2 consecutive iterations.…”
Section: The Models and Analysis Results (mentioning)
confidence: 99%
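The sample-size schedule and stopping rule quoted in this citation statement can be sketched in a few lines. This is a minimal illustration, not code from the cited work; `run_em_iteration` is a hypothetical placeholder for one combined E- and M-step that returns updated parameter estimates.

```python
def mcem_schedule(run_em_iteration, theta0, k0=100, c=4, tol=1e-3, max_iter=200):
    """Run MCEM with a growing Monte Carlo sample size.

    Starts with k0 samples and grows via k_{t+1} = k_t + k_t/c;
    stops when the maximum relative change of all estimates is
    below `tol` (0.1%) in 2 consecutive iterations, as quoted above.
    """
    k, theta = k0, list(theta0)
    hits = 0  # consecutive iterations meeting the tolerance
    for _ in range(max_iter):
        theta_new = run_em_iteration(theta, k)
        rel = max(abs((tn - t) / t) for tn, t in zip(theta_new, theta))
        hits = hits + 1 if rel < tol else 0
        theta = theta_new
        if hits >= 2:          # <0.1% change in 2 consecutive iterations
            return theta, k
        k = k + k // c         # k_{t+1} = k_t + k_t/c with c = 4
    return theta, k
```

Growing the Monte Carlo sample size near convergence is the standard way to keep Monte Carlo error from swamping the ever-smaller EM steps.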
“…In general, approximation methods for parameter estimation in GLMMs consist of: (a) analytic simplification of the problem (using a Laplace approximation to integrate the likelihood function), such as Penalized Quasi-Likelihood (PQL) [1] and Hierarchical Generalized Linear Models (HGLM) [2]; and (b) computer-intensive methods, such as the Monte Carlo EM algorithm [3], Markov chain Monte Carlo (MCMC) [4], and Gauss-Hermite quadrature (GHQ) [5]. The Laplace approximation approximates the integrand, PQL approximates the data, and AGQ approximates the integral.…”
Section: Introduction (mentioning)
confidence: 99%
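Gauss-Hermite quadrature, the last method this citing paper lists, can be sketched for a single cluster of a random-intercept logistic GLMM. All names below are illustrative; this is a generic sketch of the technique, not code from any cited paper.

```python
import numpy as np

def cluster_likelihood_ghq(y, eta_fixed, sigma, n_nodes=20):
    """Approximate the marginal likelihood of one cluster,
    L = integral of prod_j p(y_j | u) * N(u; 0, sigma^2) du,
    by Gauss-Hermite quadrature (logistic link, scalar intercept)."""
    # Nodes/weights for integrals of the form  integral f(x) e^{-x^2} dx
    nodes, weights = np.polynomial.hermite.hermgauss(n_nodes)
    # Change of variables u = sqrt(2)*sigma*x maps the N(0, sigma^2)
    # integral into Hermite form, leaving a 1/sqrt(pi) factor.
    u = np.sqrt(2.0) * sigma * nodes
    total = 0.0
    for w, ui in zip(weights, u):
        p = 1.0 / (1.0 + np.exp(-(eta_fixed + ui)))   # inverse logit
        total += w * np.prod(p**y * (1.0 - p)**(1 - y))  # Bernoulli likelihood
    return total / np.sqrt(np.pi)
```

As sigma shrinks toward zero the quadrature collapses to the ordinary logistic likelihood evaluated at the fixed-effect linear predictor, which gives a quick sanity check.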
“…In the E-step, a Gibbs sampler is used to sample the random effects from their posterior distributions, and the M-step is the estimation of a generalized linear mixed model. Booth and Hobert (1999) implemented MCEM employing importance sampling in the E-step, and Vaida and Meng (2005) employed a slice sampler to fit generalized linear models with crossed random effects. This algorithm is also computationally expensive, requiring many draws of the random effects in the E-step to achieve a sufficiently small Monte Carlo error close to convergence.…”
Section: Introduction (mentioning)
confidence: 99%
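The importance-sampled E-step mentioned here (and in the abstract, which describes a multivariate t proposal) can be illustrated with a self-normalized importance sampler. This is a univariate sketch under assumed names (`log_joint`, `e_step_importance` are illustrative), not the paper's implementation.

```python
import math
import numpy as np

rng = np.random.default_rng(0)

def log_t_pdf(u, df, loc, scale):
    # Log density of a shifted/scaled Student-t proposal distribution
    z = (u - loc) / scale
    return (math.lgamma((df + 1) / 2) - math.lgamma(df / 2)
            - 0.5 * math.log(df * math.pi) - math.log(scale)
            - (df + 1) / 2 * np.log1p(z * z / df))

def e_step_importance(log_joint, t_df, t_loc, t_scale, n_samples=1000):
    """Self-normalized importance sampling for the E-step expectation
    E[h(u) | y]. `log_joint(u)` is the unnormalized log of p(y, u);
    returns draws and normalized weights so that
    E[h(u) | y] is approximated by sum(w * h(u))."""
    u = t_loc + t_scale * rng.standard_t(t_df, size=n_samples)
    log_w = np.array([log_joint(ui) for ui in u]) - log_t_pdf(u, t_df, t_loc, t_scale)
    w = np.exp(log_w - log_w.max())   # subtract max for numerical stability
    w /= w.sum()                      # self-normalize the weights
    return u, w
```

A heavy-tailed t proposal is a deliberate choice here: its tails dominate those of the (typically near-Gaussian) conditional distribution of the random effects, which keeps the importance weights from degenerating.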