1997
DOI: 10.2307/2533552

Linear Mixed Models with Heterogeneous within-Cluster Variances

Abstract: This paper describes an extension of linear mixed models to allow for heterogeneous within-cluster variances in the analysis of clustered data. Unbiased estimating equations based on quasi-likelihood/pseudo-likelihood and method of moments are introduced and are shown to give consistent estimators of the regression coefficients, variance components, and heterogeneity parameter under regularity conditions. Cluster-specific random effects and variances are predicted by the posterior modes. The method is illustrated…
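As a rough sketch of the model class described in the abstract (the notation here is illustrative, not taken from the paper), a linear mixed model with heterogeneous within-cluster variances can be written as

\[
y_{ij} = x_{ij}^\top \beta + z_{ij}^\top b_i + e_{ij}, \qquad b_i \sim N(0, D), \qquad e_{ij} \mid \sigma_i^2 \sim N(0, \sigma_i^2),
\]

where i indexes clusters and j indexes observations within cluster i. Rather than assuming a common residual variance, the cluster-specific variances \sigma_i^2 are treated as random draws from a shared distribution whose spread is governed by a heterogeneity parameter. The estimating equations mentioned in the abstract target \beta, the variance components in D, and this heterogeneity parameter, while the cluster-specific b_i and \sigma_i^2 are predicted by their posterior modes.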

Cited by 41 publications (44 citation statements); References 23 publications; citing statements published 1999–2024.

Citation statements
“…The technical report by Cleveland, Denby, and Liu (2002) provides a detailed description of this general class of models and summarizes much of the relevant work. These models have been developed using both Bayesian (Lindley, 1971; Leonard, 1975; Myles et al., 2003) and frequentist approaches (James et al., 1994; Chinchilli, Esinhart, and Miller, 1995; Lin et al., 1997). Most of these authors take the random scale distribution to be square root inverse gamma, though some consider the log-normal distribution.…”
Section: Introduction (mentioning)
confidence: 99%
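For concreteness (generic parameters; this is a sketch of the standard formulations, not a specification drawn from any one of the cited papers), the two random-scale assumptions mentioned above are usually written as

\[
\sigma_i^2 \sim \text{Inverse-Gamma}(a, b) \ \text{(so the scale } \sigma_i \text{ is square-root inverse gamma)}, \qquad \text{or} \qquad \log \sigma_i^2 \sim N(\mu_\sigma, \tau^2) \ \text{(log-normal)}.
\]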
“…Modelling the heterogeneity and identifying covariates that are related to variance can fully characterize the intraindividual variation [5] and provide a better understanding of the research problems [16]. Studying the heteroscedasticity of the model is not only of practical interest, but is also of important theoretical significance.…”
Section: Introduction (mentioning)
confidence: 99%
“…Vonesh [26] also extended several estimation procedures for nonlinear mixed models in which the variance was related to the mean response. Lin et al. [16] estimated the regression coefficients, variance components, and heterogeneity parameters in linear mixed models based on quasi-likelihood and method of moments. Other researchers employed Bayesian approaches [7,15] and quantile regression methods [29] to assess the heterogeneity of residual variances in mixed models and multilevel models.…”
Section: Introduction (mentioning)
confidence: 99%
“…Pourahmadi and Daniels [4] develop a class of models they call dynamic conditionally linear mixed models in which the marginal covariance matrix is allowed to vary across individuals, but they consider the random effects covariance matrix to be constant across subjects. In the context of linear mixed models, Lin et al. [5] examined heterogeneity in the within-individual variances, and Zhang and Weiss [6] discussed heterogeneity in the random effects covariance matrix but mainly considered models that allow the entire matrix to differ by a multiplicative factor. Little work has been done on modelling the entire random effects covariance matrix.…”
Section: Introduction (mentioning)
confidence: 99%
“…Table of models fit with W_{i,kj} (design vector for GARP) and H_{ik} (design vector for IV) defined as in (4) and (5). Posterior means and 95 per cent credible intervals for depression score at week 16 and the change from baseline for the best model.…”
(mentioning)
confidence: 99%