2018
DOI: 10.1016/j.ecosta.2017.04.003
Model comparison for generalized linear models with dependent observations

Abstract: The stochastic expansion of the marginal quasi-likelihood function associated with a class of generalized linear models is shown. Based on the expansion, a quasi-Bayesian information criterion is proposed that is able to deal with misspecified models and dependent data, resulting in a theoretical extension of Schwarz's classical Bayesian information criterion. It is also proved that the proposed criterion has model selection consistency with respect to the optimal model. Some illustrative numeric…
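For context, the baseline that the abstract refers to is Schwarz's Bayesian information criterion; a brief reminder of its standard form is given below. The proposed quasi-Bayesian criterion is built from the marginal quasi-likelihood rather than a genuine likelihood, and its exact form is given in the paper.

```latex
% Schwarz's classical criterion for a candidate model with k free
% parameters, maximized log-likelihood \ell_n(\hat\theta), and sample
% size n (the quasi-Bayesian version replaces the genuine likelihood
% with the marginal quasi-likelihood; see the paper for its exact form).
\[
  \mathrm{BIC} = -2\,\ell_n(\hat{\theta}) + k \log n .
\]
```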

Cited by 5 publications (2 citation statements) · References 27 publications
“…Because of this, there is greater flexibility in the relationship between the mean of the variable of interest and the linear predictor η. The GLM is an extension of linear regression models, as it has a linear structure and consists of three components (Eguchi, 2017; Cordeiro and Andrade, 2009; Gracindo et al., 2011): a random component, represented by a set of independent values Y_1, Y_2, ..., Y_n with a distribution belonging to the exponential family; a systematic component, which enters the model as the linear sum of the effects of the explanatory variables and is given by η = Xβ, where X is the model matrix, β the parameter vector and η the linear predictor; and a link function g that connects the mean of the observations to the systematic part. …”
Section: Methods
Mentioning confidence: 99%
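The three components described in this statement can be illustrated with a small simulation. The sketch below is illustrative only: the data, the Poisson/log-link choice, and the use of statsmodels are assumptions, not taken from the cited works.

```python
import numpy as np
import statsmodels.api as sm

# Illustrative sketch of the three GLM components described above
# (hypothetical Poisson data with a log link; not from the cited works).
rng = np.random.default_rng(0)
n = 200
X = sm.add_constant(rng.normal(size=(n, 2)))   # model matrix X
beta = np.array([0.5, 0.3, -0.2])              # parameter vector beta
eta = X @ beta                                 # systematic component: eta = X beta

# Random component: independent Y_1, ..., Y_n from an exponential-family
# distribution; the log link g connects the mean mu = exp(eta) to eta.
y = rng.poisson(np.exp(eta))

fit = sm.GLM(y, X, family=sm.families.Poisson()).fit()  # log link is the default
print(fit.params)  # estimates of beta
```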
“…Among others, well-known model selection criteria include the Akaike information criterion (AIC) [1,2] and the Bayesian information criterion (BIC) [32], where the former is based on the Kullback-Leibler (KL) divergence principle of model selection and the latter originates from the Bayesian principle of model selection. A great deal of work has been devoted to understanding and extending these model selection criteria to different model settings; see, for example, [4,20,25,24,8,9,27,30,11,23]. The connections between the AIC and cross-validation have been investigated in [34,21,31] for various contexts.…”
Section: Introduction
Mentioning confidence: 99%
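As a concrete illustration of how such criteria are used in practice, the hedged sketch below compares two candidate Poisson GLMs by AIC and by Schwarz's BIC computed directly from its definition. The data and model choices are hypothetical and not taken from the cited references.

```python
import numpy as np
import statsmodels.api as sm

def schwarz_bic(fit, n):
    """Schwarz's criterion: k * log(n) - 2 * maximized log-likelihood."""
    k = len(fit.params)
    return k * np.log(n) - 2 * fit.llf

# Hypothetical data in which only x1 affects the response, so the
# criteria should favor the smaller model.
rng = np.random.default_rng(1)
n = 500
x1, x2 = rng.normal(size=n), rng.normal(size=n)
y = rng.poisson(np.exp(0.4 + 0.6 * x1))

X_small = sm.add_constant(x1)
X_big = sm.add_constant(np.column_stack([x1, x2]))

for label, X in [("y ~ x1", X_small), ("y ~ x1 + x2", X_big)]:
    fit = sm.GLM(y, X, family=sm.families.Poisson()).fit()
    print(f"{label}: AIC = {fit.aic:.1f}, BIC = {schwarz_bic(fit, n):.1f}")
```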