2006
DOI: 10.1016/j.jmva.2005.06.005

Corrected version of AIC for selecting multivariate normal linear regression models in a general nonnormal case

Abstract: This paper deals with the bias reduction of Akaike information criterion (AIC) for selecting variables in multivariate normal linear regression models when the true distribution of observation is an unknown nonnormal distribution. We propose a corrected version of AIC which is partially constructed by the jackknife method and is adjusted to the exact unbiased estimator of the risk when the candidate model includes the true model. It is pointed out that the influence of nonnormality in the bias of our criterion…
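The jackknife-style bias correction described in the abstract can be illustrated with a minimal sketch. This is a simplified, hypothetical univariate version (the paper treats the multivariate case); `aic_normal_regression` and `jackknife_bias` are our own illustrative names, not the paper's estimator.

```python
import numpy as np

def aic_normal_regression(X, y):
    """AIC for a normal linear regression y = X b + e, with ML variance."""
    n, p = X.shape
    b = np.linalg.lstsq(X, y, rcond=None)[0]
    resid = y - X @ b
    sigma2 = resid @ resid / n                       # ML estimator of variance
    loglik = -0.5 * n * (np.log(2 * np.pi * sigma2) + 1)
    k = p + 1                                        # coefficients + variance
    return -2 * loglik + 2 * k

def jackknife_bias(X, y, stat):
    """Leave-one-out jackknife estimate of the bias of a statistic `stat`."""
    n = X.shape[0]
    full = stat(X, y)
    loo = np.array([stat(np.delete(X, i, axis=0), np.delete(y, i))
                    for i in range(n)])
    return (n - 1) * (loo.mean() - full)
```

The jackknife recomputes the statistic on each leave-one-out sample; the paper's criterion instead builds the correction term from such leave-one-out quantities and rescales it so that it is exactly unbiased when the candidate model contains the true model.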

Cited by 21 publications (15 citation statements)
References 13 publications
“…However, Fujikoshi et al [5] pointed out that TIC in normal regression models hardly corrects the bias in actual use, because its bias correction term mainly consists of an estimator of the fourth cumulant of the true distribution. Such an estimator tends to underestimate too much, even if the sample size n is moderate (see [15]). …”
Section: Introduction
confidence: 99%
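The fourth-cumulant estimator this passage refers to can be sketched as follows. This is a hypothetical minimal univariate version (the cited criteria use a multivariate analogue); the function name is ours.

```python
import numpy as np

def sample_kurtosis(x):
    """Naive moment estimator of the standardized fourth cumulant
    (excess kurtosis): kappa_4 = E[(x - mu)^4] / sigma^4 - 3."""
    x = np.asarray(x, dtype=float)
    c = x - x.mean()
    m2 = np.mean(c ** 2)
    m4 = np.mean(c ** 4)
    return m4 / m2 ** 2 - 3.0
```

One way to see the underestimation mentioned above: the sample ratio m4/m2^2 is bounded above by n (since m4 <= max(c_i^2) * m2 and max(c_i^2) <= n * m2), so for heavy-tailed true distributions whose population kurtosis is large, moderate samples cannot produce estimates anywhere near the true value.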
“…Therefore, unlike TIC, the CV criterion can correct the bias efficiently. Using the better property of the CV criterion, Yanagihara [14,15] proposed new criteria which are partially constructed by the cross-validation method, and which are slightly influenced by the difference between the true distribution g(y) and the candidate model f(y|θ). However, a bias for the risk exists also in the CV criterion.…”
Section: Introduction
confidence: 99%
“…Therefore, we can see that AIC_J corrects the AIC's bias by replacing B̂_AIC with a renewal term, as in TIC and EIC. However, comparing the first terms in the asymptotic expansions of the biases, the bias of AIC_J tends to be smaller than those of TIC and EIC (see Yanagihara, 2006b). Moreover, from the numerical study in Yanagihara (2006b), we can see that the bias of AIC_J becomes the smallest among the biases of AIC, TIC and EIC.…”
Section: Jackknifed AIC
confidence: 98%
“…Recently, Yanagihara (2006b) proposed a bias-corrected AIC which consists of a jackknife estimate of the bias. He evaluated the bias from the predicted residual sum of squares (PRESS) and constructed a bias-correction term that is exactly unbiased, by multiplying by a constant coefficient, when the candidate model includes the true model.…”
Section: Introduction
confidence: 99%
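The PRESS statistic mentioned in this passage has a well-known leave-one-out shortcut for linear regression, e_i / (1 - h_ii), which avoids refitting the model n times. A minimal sketch (univariate response; the function name is ours):

```python
import numpy as np

def press(X, y):
    """Predicted residual sum of squares (PRESS) for linear regression,
    via the leave-one-out shortcut e_(i) = e_i / (1 - h_ii)."""
    H = X @ np.linalg.pinv(X.T @ X) @ X.T            # hat matrix
    b = np.linalg.lstsq(X, y, rcond=None)[0]
    e = y - X @ b                                    # ordinary residuals
    return float(np.sum((e / (1 - np.diag(H))) ** 2))
```

For full-rank X this equals the sum of squared prediction errors from n explicit leave-one-out refits, which is why PRESS is a natural building block for jackknife-type bias corrections.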
“…Although we will consider primarily the above five criteria, the family also includes information criteria for which the penalty terms are random variables, e.g., the modified AIC (MAIC) proposed by Fujikoshi and Satoh (1997), Takeuchi's information criterion (TIC) proposed by Takeuchi (1976), the extended information criterion (EIC) proposed by Ishiguro et al. (1997), the cross-validation (CV) criterion proposed by Stone (1974, 1977), and other bias-corrected AICs, such as those proposed by Fujikoshi et al. (2005), Yanagihara (2006), and Yanagihara et al. (2011) (for details of these criteria, see Yanagihara et al. (2013)). The best subset of ω, chosen by minimizing IC_m(j), is written as…”
Section: Notation and Assumptions
confidence: 99%