2018
DOI: 10.1111/rssb.12270
Asymptotic Properties and Information Criteria for Misspecified Generalized Linear Mixed Models

Abstract: The problem of misspecification poses challenges in model selection. The paper studies the asymptotic properties of estimators for generalized linear mixed models with misspecification under the framework of conditional Kullback–Leibler divergence. A conditional generalized information criterion is introduced, and a model selection procedure is proposed by minimizing the criterion. We prove that the model selection procedure proposed is asymptotically loss efficient when all the candidate models are misspecified. …
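The selection procedure described in the abstract — score every candidate model with an information criterion and keep the minimizer — can be sketched generically. This is an illustrative sketch only: the names `generalized_ic` and `select_model`, the candidate list, and the fixed penalty are placeholders, not the paper's CGIC, whose criterion is built from a conditional log-likelihood (with predicted random effects) and its own penalty term.

```python
# Illustrative sketch: model selection by minimizing a generic
# information criterion of the form -2 * loglik + penalty * dof.
# NOT the paper's CGIC; all names and numbers below are hypothetical.

def generalized_ic(loglik, dof, penalty):
    """Generic criterion: -2 * log-likelihood + penalty * model dimension."""
    return -2.0 * loglik + penalty * dof

def select_model(candidates, penalty=2.0):
    """Return the name of the candidate with the smallest criterion value."""
    scored = [(generalized_ic(loglik, dof, penalty), name)
              for name, loglik, dof in candidates]
    return min(scored)[1]

# Hypothetical candidates: (name, fitted log-likelihood, degrees of freedom).
candidates = [
    ("intercept-only", -120.4, 1),
    ("x1",             -101.7, 2),
    ("x1+x2",          -100.9, 3),
]

print(select_model(candidates, penalty=2.0))  # prints "x1"
```

With these numbers the extra parameter in `x1+x2` does not reduce the criterion enough to beat `x1`, which is exactly the trade-off an information criterion formalizes.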

Cited by 14 publications (4 citation statements) · References 40 publications
“…The first part of Condition 3 requires inf_w Q_n^*(w) to grow at a rate no slower than n^{1/2}, which excludes the situation where the true model lies within the set of candidate models. A similar assumption was imposed in condition (8) of Ando and Li [1] and Theorem 2 of Yu et al. [36]. The second part of Condition 3 places a mild assumption on the quasi-true value of the Poisson count component.…”
Section: Discussion (mentioning)
confidence: 93%
“…Pu and Niu (2006) show the asymptotic loss efficiency of GIC for selecting the fixed-effects models given the correct random-effects models. Yu et al. (2018) show the asymptotic loss efficiency of the conditional generalized information criterion (CGIC) for selecting generalized linear mixed-effects models when the random effects are predicted by treating them as fixed ones. Although it is common to predict random effects using the empirical best linear unbiased predictors (EBLUPs) — see Robinson (1991) and Bates et al. (2015) — to the best of our knowledge, no asymptotic loss efficiency has been derived for EBLUPs.…”
Section: Introduction (mentioning)
confidence: 99%
“…These results are not directly applicable to generalized mixed models (Saefken, Kneib, van Waveren, & Greven). However, Yu, Zhang, and Yau (2018) propose a conditional generalized information criterion based on the conditional Kullback–Leibler divergence for possibly misspecified data modeled by a generalized linear mixed model. In a recent contribution, Sakamoto introduces a bias reduction for the marginal AIC.…”
Section: Introduction (mentioning)
confidence: 99%