Persistent link: https://www.econbiz.de/10003023786
Persistent link: https://www.econbiz.de/10003771780
Many methods of computational statistics lead to matrix-algebra or numerical-mathematics problems. For example, the least squares method in linear regression reduces to solving a system of linear equations. The principal components method is based on finding eigenvalues and eigenvectors of a...
Persistent link: https://www.econbiz.de/10003024181
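The two reductions the abstract mentions can be illustrated in a few lines of NumPy. This is a generic sketch with made-up data, not code from the work cited: least squares solved via the normal equations, and principal components obtained from the eigendecomposition of the sample covariance matrix.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.normal(size=100)

# Least squares reduces to a system of linear equations,
# the normal equations: (X'X) b = X'y.
beta = np.linalg.solve(X.T @ X, X.T @ y)

# Principal components: eigenvalues and eigenvectors of the
# sample covariance matrix of the centered data.
Xc = X - X.mean(axis=0)
cov = (Xc.T @ Xc) / (len(X) - 1)
eigvals, eigvecs = np.linalg.eigh(cov)  # eigenvalues in ascending order
```

In practice one would prefer `np.linalg.lstsq` (QR/SVD-based) over forming the normal equations explicitly, which squares the condition number of the problem.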
Most dimension reduction methods based on nonparametric smoothing are highly sensitive to outliers and to data coming from heavy-tailed distributions. We show that the MAVE and OPG methods recently proposed by Xia et al. (2002) can be robustified in a relatively straightforward way...
Persistent link: https://www.econbiz.de/10010296438
Persistent link: https://www.econbiz.de/10010309973
Persistent link: https://www.econbiz.de/10010310226
Classical parametric estimation methods applied to nonlinear regression and limited-dependent-variable models are very sensitive to misspecification and data errors. On the other hand, semiparametric and nonparametric methods, which are not restricted by parametric assumptions, require more data...
Persistent link: https://www.econbiz.de/10010310330
Persistent link: https://www.econbiz.de/10010310409
The Nadaraya-Watson estimator of regression is known to be highly sensitive to the presence of outliers in the sample. A possible way of robustification consists in using local L-estimates of regression. Whereas the local L-estimation is traditionally done using an empirical conditional...
Persistent link: https://www.econbiz.de/10010310509
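The outlier sensitivity described above is easy to demonstrate: the Nadaraya-Watson estimate at a point is a kernel-weighted average of the responses, so a single gross outlier near the evaluation point shifts the fit substantially. The sketch below uses synthetic data and a Gaussian kernel; it illustrates the problem only, not the local L-estimation remedy proposed in the paper.

```python
import numpy as np

def nadaraya_watson(x0, x, y, h):
    """Nadaraya-Watson estimate at x0: kernel-weighted mean of y."""
    w = np.exp(-0.5 * ((x0 - x) / h) ** 2)  # Gaussian kernel weights
    return np.sum(w * y) / np.sum(w)

rng = np.random.default_rng(1)
x = np.linspace(0.0, 1.0, 50)
y = np.sin(2 * np.pi * x) + 0.1 * rng.normal(size=50)

fit = nadaraya_watson(0.25, x, y, h=0.05)   # true value sin(pi/2) = 1

# One gross outlier near x = 0.25 pulls the weighted mean far off.
y_out = y.copy()
y_out[12] += 10.0                            # x[12] is close to 0.25
fit_out = nadaraya_watson(0.25, x, y_out, h=0.05)
```

Replacing the weighted mean by a weighted median or other local L-estimate bounds the influence of such a point, which is the direction the abstract describes.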
Persistent link: https://www.econbiz.de/10010310548