In this paper, the cross-validation methods, namely $C_{p}$, PRESS and GCV, are presented under the multiple linear regression model when multicollinearity exists and additional information imposes restrictions among the parameters that should hold in exact terms. The selection...
Persistent link: https://www.econbiz.de/10011241324
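The entry above refers to the standard $C_{p}$, PRESS and GCV criteria. As a point of reference only, here is a minimal sketch of those three criteria for an unrestricted ordinary least squares fit; the restricted versions studied in the paper are not reproduced, and the function name and arguments are illustrative.

```python
import numpy as np

def cv_criteria(X, y, sigma2_full=None):
    """Mallows' Cp, PRESS and GCV for the OLS fit of y on X (X is assumed to include an intercept column)."""
    n, p = X.shape
    H = X @ np.linalg.pinv(X.T @ X) @ X.T        # hat matrix
    resid = y - H @ y                            # OLS residuals
    rss = float(resid @ resid)
    h = np.diag(H)                               # leverages h_ii
    press = np.sum((resid / (1.0 - h)) ** 2)     # leave-one-out PRESS
    gcv = (rss / n) / (1.0 - np.trace(H) / n) ** 2
    if sigma2_full is None:                      # fall back to this fit's own residual variance
        sigma2_full = rss / (n - p)
    cp = rss / sigma2_full - n + 2 * p           # Mallows' Cp
    return cp, press, gcv
```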
In this study a new two-parameter estimator which includes the ordinary least squares, the principal components regression (PCR) and the Liu-type estimator is proposed. Conditions for the superiority of this new estimator over the PCR, r–k class estimator and Liu-type estimator are derived....
Persistent link: https://www.econbiz.de/10011151888
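The new two-parameter estimator itself is not given in the snippet above, so the sketch below only illustrates two of the special cases it is said to include, the standard Liu estimator and principal components regression; the function names and the choices of d and r are illustrative.

```python
import numpy as np

def liu_estimator(X, y, d):
    """Standard Liu estimator (X'X + I)^{-1} (X'X + d I) beta_OLS, with 0 < d < 1."""
    p = X.shape[1]
    XtX = X.T @ X
    beta_ols = np.linalg.solve(XtX, X.T @ y)
    return np.linalg.solve(XtX + np.eye(p), (XtX + d * np.eye(p)) @ beta_ols)

def pcr_estimator(X, y, r):
    """Principal components regression keeping the r largest components."""
    eigval, V = np.linalg.eigh(X.T @ X)          # spectral decomposition of X'X
    Vr = V[:, np.argsort(eigval)[::-1][:r]]      # directions of the r largest eigenvalues
    Z = X @ Vr                                   # component scores
    gamma = np.linalg.solve(Z.T @ Z, Z.T @ y)    # regress y on the retained components
    return Vr @ gamma                            # map back to the original coefficients
```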
Persistent link: https://www.econbiz.de/10009324798
Persistent link: https://www.econbiz.de/10010558280
In this paper, an improved ridge type estimator is introduced to overcome the effect of multi-collinearity in logistic regression. The proposed estimator is called a modified almost unbiased ridge logistic estimator. It is obtained by combining the ridge estimator and the almost unbiased ridge...
Persistent link: https://www.econbiz.de/10013444149
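The modified almost unbiased ridge logistic estimator is not spelled out in the snippet above; the sketch below shows only the classical ridge logistic estimator of the form $(X'\hat{W}X + kI)^{-1} X'\hat{W}X \hat{\beta}_{MLE}$ that such proposals build on, with the MLE obtained by Newton-Raphson. The function name and iteration count are illustrative.

```python
import numpy as np

def ridge_logistic(X, y, k, n_iter=25):
    """Ridge logistic estimator (X'WX + kI)^{-1} X'WX beta_MLE built on the unpenalized MLE."""
    p = X.shape[1]
    beta = np.zeros(p)
    for _ in range(n_iter):                      # Newton-Raphson for the logistic MLE
        mu = 1.0 / (1.0 + np.exp(-X @ beta))     # fitted probabilities
        W = mu * (1.0 - mu)                      # IRLS weights (diagonal of W)
        XtWX = X.T @ (W[:, None] * X)
        beta = beta + np.linalg.solve(XtWX, X.T @ (y - mu))
    mu = 1.0 / (1.0 + np.exp(-X @ beta))
    W = mu * (1.0 - mu)
    XtWX = X.T @ (W[:, None] * X)
    return np.linalg.solve(XtWX + k * np.eye(p), XtWX @ beta)  # shrink the MLE
```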
Persistent link: https://www.econbiz.de/10001081584
Persistent link: https://www.econbiz.de/10012156806
Persistent link: https://www.econbiz.de/10011795220
In this paper, an improved ridge type estimator is introduced to overcome the effect of multi-collinearity in logistic regression. The proposed estimator is called a modified almost unbiased ridge logistic estimator. It is obtained by combining the ridge estimator and the almost unbiased ridge...
Persistent link: https://www.econbiz.de/10013428849
We give a new consistency proof for high-dimensional quantile regression estimators. A consequence of this proof is that the number of significant regressors can grow at a rate $s\log^{2}(s) = o(n)$. To the best of our knowledge, this is the fastest rate achieved for high-dimensional quantile regression.
Persistent link: https://www.econbiz.de/10010678734
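As context for the consistency result above, the sketch below fits an $\ell_1$-penalized (sparse) quantile regression on simulated data with more regressors than observations. The use of scikit-learn's QuantileRegressor and all data-generating choices are assumptions for illustration, not the estimator analysed in the paper.

```python
import numpy as np
from sklearn.linear_model import QuantileRegressor

rng = np.random.default_rng(0)
n, p, s = 200, 400, 5                            # more regressors (p) than observations (n)
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:s] = 1.0                                   # only s coefficients are truly nonzero
y = X @ beta + rng.standard_normal(n)

# quantile=0.5 is median regression; alpha sets the strength of the L1 penalty
model = QuantileRegressor(quantile=0.5, alpha=0.1, solver="highs").fit(X, y)
print("nonzero coefficients selected:", int(np.sum(np.abs(model.coef_) > 1e-8)))
```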