Prediction Error Property of the Lasso Estimator and its Generalization
The lasso procedure is a shrinkage and variable selection method. This paper shows that there always exists an interval of tuning-parameter values over which the mean squared prediction error of the lasso estimator is smaller than that of the ordinary least squares estimator. For any estimator satisfying a condition such as unbiasedness, the paper defines a corresponding generalized lasso estimator, whose mean squared prediction error is shown to be smaller than that of the original estimator for tuning-parameter values in some interval. This implies that no unbiased estimator is admissible. Simulation results for five models support the theoretical results. Copyright 2003 Australian Statistical Publishing Association Inc.
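The central claim, that some positive tuning parameter gives the lasso a smaller mean squared prediction error than ordinary least squares, can be illustrated with a small Monte Carlo sketch. The design, coefficient vector, and noise level below are illustrative assumptions, not the paper's five simulation models; an orthonormal design is chosen so that the lasso solution reduces to soft-thresholding the OLS estimate.

```python
import numpy as np

rng = np.random.default_rng(0)

def soft_threshold(z, lam):
    # Closed-form lasso solution when the design matrix has orthonormal columns
    return np.sign(z) * np.maximum(np.abs(z) - lam, 0.0)

# Orthonormal design: X'X = I, so the OLS estimate is X'y
n, p = 20, 5
X, _ = np.linalg.qr(rng.standard_normal((n, p)))  # columns orthonormal
beta = np.array([3.0, 1.5, 0.0, 0.0, 0.5])        # assumed true coefficients
sigma = 1.0                                        # assumed noise level

lams = np.linspace(0.0, 1.0, 21)   # grid of tuning-parameter values; lams[0] = 0 is OLS
mspe = np.zeros_like(lams)
reps = 2000
for _ in range(reps):
    y = X @ beta + sigma * rng.standard_normal(n)
    b_ols = X.T @ y
    for i, lam in enumerate(lams):
        b_lasso = soft_threshold(b_ols, lam)
        # With orthonormal X, ||X beta - X b||^2 = ||beta - b||^2
        mspe[i] += np.sum((beta - b_lasso) ** 2)
mspe /= reps
```

Here `mspe[0]` is the Monte Carlo estimate of the OLS prediction risk (about p * sigma^2), and the minimum of `mspe` over the grid is attained at a positive tuning parameter, consistent with the paper's result that a whole interval of tuning-parameter values improves on OLS.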
Year of publication: 2003
Authors: Huang, Fuchun
Published in: Australian & New Zealand Journal of Statistics. - Australian Statistical Publishing Association Inc. - Vol. 45.2003, 2, p. 217-228
Publisher: Australian Statistical Publishing Association Inc.
Similar items by person
-
The geometric ergodicity and existence of moments for a class of non-linear time series model
An, Hongzhi, (1997)
-
Generalized Pseudo-Likelihood Estimates for Markov Random Fields on Lattice
Huang, Fuchun, (2002)
- More ...