Corrected version of AIC for selecting multivariate normal linear regression models in a general nonnormal case
This paper deals with the bias reduction of the Akaike information criterion (AIC) for selecting variables in multivariate normal linear regression models when the true distribution of the observations is an unknown nonnormal distribution. We propose a corrected version of AIC which is partially constructed by the jackknife method and is adjusted to be an exact unbiased estimator of the risk when the candidate model includes the true model. It is pointed out that the influence of nonnormality on the bias of our criterion is smaller than that on AIC and TIC. We verify that our criterion is better than AIC, TIC, and EIC by conducting numerical experiments.
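As background to the abstract, the sketch below shows how the *plain* AIC is used to select regressors in a multivariate normal linear regression; it is not the paper's corrected criterion, and all data and column names are illustrative assumptions.

```python
# Hedged sketch: ordinary AIC for selecting regressors in a multivariate
# normal linear regression Y = X B + E (NOT the paper's corrected AIC).
from itertools import combinations

import numpy as np

def aic_mvreg(Y, X):
    """AIC of the multivariate normal linear model fitted by least squares."""
    n, p = Y.shape
    k = X.shape[1]
    B, *_ = np.linalg.lstsq(X, Y, rcond=None)   # MLE of coefficients
    R = Y - X @ B                               # residuals
    Sigma = R.T @ R / n                         # MLE of error covariance
    _, logdet = np.linalg.slogdet(Sigma)
    loglik = -0.5 * n * (p * np.log(2 * np.pi) + logdet + p)
    n_params = k * p + p * (p + 1) / 2          # coefficients + covariance
    return -2 * loglik + 2 * n_params

# Toy data (assumed setup): only the first regressor truly matters.
rng = np.random.default_rng(0)
n, p = 100, 2
X_full = np.column_stack([np.ones(n), rng.normal(size=(n, 3))])
B_true = np.array([[1.0, 0.5], [2.0, -1.0], [0.0, 0.0], [0.0, 0.0]])
Y = X_full @ B_true + rng.normal(size=(n, p))

# Score every candidate model (intercept plus a subset of regressors).
scores = {}
for r in range(4):
    for subset in combinations(range(1, 4), r):
        cols = (0,) + subset
        scores[cols] = aic_mvreg(Y, X_full[:, cols])
best = min(scores, key=scores.get)
print("selected columns:", best)
```

Because this AIC assumes normal errors, its bias under nonnormal data is what the paper's jackknife-based correction targets.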
Year of publication: 2006
Authors: Yanagihara, Hirokazu
Published in: Journal of Multivariate Analysis. - Elsevier, ISSN 0047-259X. - Vol. 97.2006, 5, p. 1070-1089
Publisher: Elsevier
Keywords: Bias reduction; Influence of nonnormality; Kullback-Leibler information; Jackknife method; Model misspecification; Normal assumption; Predicted residuals; Selection of variables; Robustness
Similar items by person
-
"Tests for Covariance Matrices in High Dimension with Less Sample Size"
Srivastava, Muni S., (2014)
-
Fujikoshi, Yasunori, (2003)
-
Fujikoshi, Yasunori, (2005)
- More ...