In high-dimensional data analysis, feature selection is an effective means of dimension reduction, carried out alongside parameter estimation. Concerning the accuracy of selection and estimation, we study nonconvex constrained and regularized likelihoods in the presence of nuisance parameters....
Persistent link: https://www.econbiz.de/10010971175
High-dimensional feature selection has become increasingly crucial for seeking parsimonious models in estimation. For selection consistency, we derive a necessary and sufficient condition formulated in terms of the degree of separation. The minimal degree of separation is necessary for any...
Persistent link: https://www.econbiz.de/10011000073
In high-dimensional regression, grouping pursuit and feature selection each have their own merits and complement each other in battling the curse of dimensionality. To seek a parsimonious model, we perform simultaneous grouping pursuit and feature selection over an arbitrary undirected graph...
Persistent link: https://www.econbiz.de/10010690642
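The abstract above combines a sparsity penalty with a fusion penalty over graph edges. As a minimal illustration of that combined penalty only (not the paper's nonconvex estimator; the function name, weights, and chain graph below are hypothetical), one might write:

```python
import numpy as np

# Illustration of a combined penalty for simultaneous grouping pursuit
# and feature selection over an undirected graph: an l1 term encourages
# sparsity, and a fusion term over graph edges encourages equal
# coefficients within groups. (A sketch, not the paper's method.)
def grouped_selection_penalty(beta, edges, lam1=1.0, lam2=1.0):
    """lam1 * sum_j |beta_j| + lam2 * sum_{(j,k) in edges} |beta_j - beta_k|."""
    beta = np.asarray(beta, dtype=float)
    sparsity = np.abs(beta).sum()
    fusion = sum(abs(beta[j] - beta[k]) for j, k in edges)
    return lam1 * sparsity + lam2 * fusion

# Chain graph over four features: equal nonzero coefficients pay no
# fusion cost, while a coefficient that breaks from its neighbors does.
edges = [(0, 1), (1, 2), (2, 3)]
print(grouped_selection_penalty([1, 1, 1, 0], edges))  # 3 + 1 = 4.0
print(grouped_selection_penalty([1, 2, 1, 0], edges))  # 4 + 3 = 7.0
```

Minimizing a loss plus this penalty simultaneously zeroes out coefficients (selection) and ties neighboring coefficients together (grouping).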
It is not unusual for the response variable in a regression model to be subject to censoring or truncation. Tobit regression models are a specific example of such a situation, where for some observations the observed response is not the actual response, but rather the censoring value...
Persistent link: https://www.econbiz.de/10012769195
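The censoring mechanism described above can be sketched in a few lines. This is an illustration of Tobit-style left-censoring only, not the paper's estimator; the linear model, coefficients, and censoring point zero are assumptions for the example:

```python
import numpy as np

# Tobit-style censoring: the latent (actual) response y_star follows a
# linear model, but we only observe y = max(y_star, c), so for censored
# observations the recorded response is the censoring value c, not the
# actual response.
rng = np.random.default_rng(0)
n = 200
x = rng.normal(size=n)
y_star = 0.5 + 1.0 * x + rng.normal(size=n)   # latent (actual) response
c = 0.0                                        # censoring value
y_obs = np.maximum(y_star, c)                  # observed response

censored = y_star < c
print(f"{censored.mean():.0%} of observations are censored")
```

Fitting ordinary least squares to `y_obs` would be biased precisely because the censored observations record `c` rather than `y_star`, which is what motivates Tobit-type likelihoods.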
Influence diagnosis is important, since the presence of influential observations can lead to distorted analyses and misleading interpretations. This is particularly true for high-dimensional data, as the increased dimensionality and complexity may amplify both the chance of an observation being...
Persistent link: https://www.econbiz.de/10013076985
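The abstract above does not specify its diagnostic; as a classical low-dimensional baseline for the idea, Cook's distance for least squares flags observations whose removal would most change the fit. A minimal sketch (the data, planted outlier, and dimensions are assumptions for the example):

```python
import numpy as np

# Classical influence diagnostic (Cook's distance) for least squares --
# a low-dimensional baseline for influence diagnosis, not the paper's
# high-dimensional method.
rng = np.random.default_rng(1)
n, p = 50, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, p - 1))])
beta = np.array([1.0, 2.0, -1.0])
y = X @ beta + rng.normal(size=n)
y[0] += 10.0                                  # plant one influential point

H = X @ np.linalg.solve(X.T @ X, X.T)         # hat matrix
h = np.diag(H)                                # leverages
resid = y - H @ y                             # residuals
s2 = resid @ resid / (n - p)                  # residual variance estimate
cooks = resid**2 / (p * s2) * h / (1 - h)**2  # Cook's distance

print("most influential index:", int(np.argmax(cooks)))
```

In high dimensions the hat-matrix route breaks down (X.T @ X is singular when p > n), which is the complication the paper is concerned with.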