Nonconcave penalized inverse regression in single-index models with high dimensional predictors
In this paper we aim to estimate the direction in general single-index models and to select important variables simultaneously when a diverging number of predictors are involved in the regression. To this end, we propose the nonconcave penalized inverse regression method. Specifically, the resulting estimator with the SCAD penalty enjoys an oracle property in semi-parametric models even when the dimension, p_n, of the predictors goes to infinity. Under regularity conditions we also establish asymptotic normality when the dimension of the predictor vector goes to infinity at the rate p_n = o(n^{1/3}), where n is the sample size, which enables us to construct confidence intervals/regions for the estimated index. The asymptotic results are augmented by simulations and illustrated by an analysis of an air pollution dataset.
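The SCAD penalty named in the abstract is not written out here; for orientation, a minimal sketch of the penalty function (using the conventional tuning constant a = 3.7 from Fan and Li's original proposal; the function names and defaults below are illustrative, not the authors' code) might look like:

```python
import numpy as np

def scad_penalty(theta, lam=1.0, a=3.7):
    """SCAD penalty of Fan and Li (2001), applied elementwise.

    Behaves like the lasso (linear) near zero, tapers quadratically
    on (lam, a*lam], and is constant beyond a*lam, so large
    coefficients are not over-shrunk -- the feature behind
    oracle-type results for SCAD-penalized estimators.
    """
    t = np.abs(np.asarray(theta, dtype=float))
    linear = lam * t                                        # |t| <= lam
    quad = (2 * a * lam * t - t**2 - lam**2) / (2 * (a - 1))  # lam < |t| <= a*lam
    const = lam**2 * (a + 1) / 2                            # |t| > a*lam
    return np.where(t <= lam, linear, np.where(t <= a * lam, quad, const))
```

Because the penalty is flat beyond a·lam, its derivative vanishes there, which is what allows the penalized estimator to leave large coefficients essentially unbiased while still shrinking small ones to exactly zero.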
Year of publication: 2009
Authors: Zhu, Li-Ping; Zhu, Li-Xing
Published in: Journal of Multivariate Analysis. - Elsevier, ISSN 0047-259X. - Vol. 100.2009, 5, p. 862-875
Publisher: Elsevier
Keywords: 62H15; 62G20; Dimension reduction; Diverging parameters; Inverse regression; SCAD; Sparsity
Similar items by person
- On a dimension reduction regression with covariate adjustment. Zhang, Jun, (2012)
- Dimension Reduction in Regressions Through Cumulative Slicing Estimation. Zhu, Li-Ping, (2010)
- Model-Free Feature Screening for Ultrahigh-Dimensional Data. Zhu, Li-Ping, (2011)