"Estimation of Parameters in a Linear Regression Model under the Kullback-Leibler Loss"
This paper is concerned with the simultaneous estimation of the regression coefficients and the error variance in a linear regression model. Motivated by the Akaike information criterion, the expected Kullback-Leibler distance is employed as the risk function for comparing estimators in a decision-theoretic framework. This setup makes the risk difficult to handle because an estimator of the variance is incorporated into the loss for estimating the regression coefficients. In this situation, several estimators of the variance and the regression coefficients are proposed and shown to improve on the usual estimators taken as benchmarks. Simulation studies of the risk behavior of the estimators show numerically that a truncated estimator has a more favorable risk than the usual estimators.
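For concreteness, the Kullback-Leibler loss referred to in the abstract presumably takes the standard form for the normal linear model $y \sim N_n(X\beta, \sigma^2 I_n)$; the display below is a sketch under that assumption, with $\hat{\beta}$ and $\hat{\sigma}^2$ denoting generic estimators rather than the specific procedures proposed in the paper.

$$
L\bigl((\hat{\beta},\hat{\sigma}^2),(\beta,\sigma^2)\bigr)
= \frac{1}{2}\left[\frac{\|X\hat{\beta}-X\beta\|^2}{\hat{\sigma}^2}
+ n\left(\frac{\sigma^2}{\hat{\sigma}^2}
- \log\frac{\sigma^2}{\hat{\sigma}^2} - 1\right)\right],
$$

the Kullback-Leibler divergence of the fitted density $N_n(X\hat{\beta},\hat{\sigma}^2 I_n)$ from the true density $N_n(X\beta,\sigma^2 I_n)$. Note that $\hat{\sigma}^2$ appears in the denominator of the term measuring the error in $\hat{\beta}$, which is the coupling between the two estimation problems that the abstract identifies as the source of difficulty.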
| Year of publication: | 2005-11 |
|---|---|
| Authors: | Kubokawa, Tatsuya ; Tsukuma, Hisayuki |
| Institutions: | Center for International Research on the Japanese Economy (CIRJE), Faculty of Economics |
Similar items by person
- "Unified Improvements in Estimation of a Normal Covariance Matrix in High and Low Dimensions" — Tsukuma, Hisayuki, (2014)
- "A Unified Approach to Estimating a Normal Mean Matrix in High and Low Dimensions" — Tsukuma, Hisayuki, (2014)
- "Simultaneous estimation of normal precision matrices" — Tsukuma, Hisayuki, (2006)