Our aim is to construct a factor analysis method that can resist the effect of outliers. For this we start with a highly robust initial covariance estimator, after which the factors can be obtained from maximum likelihood or from principal factor analysis (PFA). We find that PFA based on the...
Persistent link: https://www.econbiz.de/10005221368
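Read as a recipe, the abstract above amounts to two steps: estimate the covariance matrix robustly, then extract factors from it by principal factor analysis. The sketch below assumes the MCD estimator (scikit-learn's MinCovDet) as the robust initial covariance and a standard iterated principal-factor step started from squared multiple correlations; the paper's actual estimator and refinements may differ.

```python
import numpy as np
from sklearn.covariance import MinCovDet

def robust_pfa(X, n_factors, n_iter=50):
    """Principal factor analysis started from a robust (MCD) covariance matrix."""
    # Step 1: highly robust initial covariance, here the MCD estimator,
    # and the corresponding correlation matrix.
    S = MinCovDet(random_state=0).fit(X).covariance_
    d = np.sqrt(np.diag(S))
    R = S / np.outer(d, d)

    # Step 2: iterated principal factor step -- put communality estimates on
    # the diagonal and keep the leading eigenvectors of the reduced matrix.
    comm = 1 - 1 / np.diag(np.linalg.inv(R))        # squared multiple correlations
    for _ in range(n_iter):
        R_red = R.copy()
        np.fill_diagonal(R_red, comm)
        vals, vecs = np.linalg.eigh(R_red)
        order = np.argsort(vals)[::-1][:n_factors]
        loadings = vecs[:, order] * np.sqrt(np.clip(vals[order], 0, None))
        new_comm = np.sum(loadings ** 2, axis=1)
        if np.allclose(new_comm, comm, atol=1e-8):
            break
        comm = new_comm
    return loadings

# Synthetic example: 2 factors, 5 variables, a handful of gross outliers.
rng = np.random.default_rng(0)
F = rng.normal(size=(200, 2))
L = rng.normal(size=(5, 2))
X = F @ L.T + 0.3 * rng.normal(size=(200, 5))
X[:5] += 10
print(robust_pfa(X, n_factors=2))
```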
In this paper we investigate the robustness properties of the deepest regression, a method for linear regression introduced by Rousseeuw and Hubert [6]. We show that the deepest regression functional is Fisher-consistent for the conditional median, and has a breakdown value of 1/3 in all dimensions....
Persistent link: https://www.econbiz.de/10005221661
For multivariate data, the halfspace depth function can be seen as a natural and affine equivariant generalization of the univariate empirical cdf. For any multivariate data set, we show that the resulting halfspace depth function completely determines the empirical distribution. We do this by...
Persistent link: https://www.econbiz.de/10005152998
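To make the object in the abstract above concrete, the sketch below approximates the Tukey halfspace depth of a point in a bivariate sample: the smallest number of observations in any closed halfspace whose boundary passes through the point. Minimizing over a finite grid of directions is a simplification for illustration; exact algorithms exist but are not attempted here.

```python
import numpy as np

def halfspace_depth(theta, X, n_dirs=360):
    """Approximate Tukey halfspace depth of a point theta in bivariate data X:
    the smallest number of observations in any closed halfspace whose boundary
    passes through theta, minimized here over a finite grid of directions."""
    angles = np.linspace(0, np.pi, n_dirs, endpoint=False)
    dirs = np.column_stack([np.cos(angles), np.sin(angles)])
    proj = (X - theta) @ dirs.T                   # signed projections per direction
    counts_pos = np.sum(proj >= 0, axis=0)        # points in one closed halfspace
    counts_neg = np.sum(proj <= 0, axis=0)        # ... and in the opposite one
    return int(np.minimum(counts_pos, counts_neg).min())

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 2))
print(halfspace_depth(np.array([0.0, 0.0]), X))   # a central point has high depth
print(halfspace_depth(np.array([4.0, 4.0]), X))   # an outlying point has depth near 0
```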
Deepest regression (DR) is a method for linear regression introduced by P. J. Rousseeuw and M. Hubert (1999, J. Amer. Statist. Assoc. 94, 388-402). The DR method is defined as the fit with largest regression depth relative to the data. In this paper we show that DR is a robust method, with...
Persistent link: https://www.econbiz.de/10005093712
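Since DR is defined as the fit maximizing regression depth, a small sketch for simple regression may help: the regression depth of a line is the smallest number of observations whose removal turns the line into a nonfit, and the deepest fit is found here by brute force over lines through pairs of observations. This is a heuristic for illustration only, not the exact algorithms of the literature.

```python
import numpy as np
from itertools import combinations

def regression_depth(slope, intercept, x, y):
    """Regression depth of the line y = slope*x + intercept (simple regression):
    the smallest number of observations to remove so that the remaining
    residuals are strictly positive on one side of some vertical line x = v
    and strictly negative on the other side (i.e. the line becomes a nonfit)."""
    r = y - (slope * x + intercept)
    xs = np.unique(x)
    # Candidate split points: below, between, and above the distinct x values.
    v_cands = np.concatenate(([xs[0] - 1.0], (xs[:-1] + xs[1:]) / 2, [xs[-1] + 1.0]))
    depth = len(x)
    for v in v_cands:
        left, right = x < v, x > v
        d1 = np.sum(left & (r <= 0)) + np.sum(right & (r >= 0))  # + on left, - on right
        d2 = np.sum(left & (r >= 0)) + np.sum(right & (r <= 0))  # - on left, + on right
        depth = min(depth, d1, d2)
    return int(depth)

def deepest_line(x, y):
    """Brute-force deepest regression: scan all lines through pairs of points
    and keep the one with the largest regression depth (a heuristic sketch)."""
    best, best_depth = None, -1
    for i, j in combinations(range(len(x)), 2):
        if x[i] == x[j]:
            continue
        slope = (y[j] - y[i]) / (x[j] - x[i])
        intercept = y[i] - slope * x[i]
        d = regression_depth(slope, intercept, x, y)
        if d > best_depth:
            best, best_depth = (slope, intercept), d
    return best, best_depth

rng = np.random.default_rng(2)
x = rng.uniform(0, 10, 40)
y = 2 * x + 1 + rng.normal(scale=0.5, size=40)
y[:8] += 30                                    # vertical outliers in 20% of the responses
theta, depth = deepest_line(x, y)
print("deepest two-point line:", theta, "with depth", depth, "out of", len(x))
```

Despite the contaminated responses, the reported slope and intercept should stay close to the uncontaminated pattern, illustrating the kind of robustness the abstract discusses.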
Motivated by the notion of regression depth (Rousseeuw and Hubert, 1996) we introduce the catline, a new method for simple linear regression. At any bivariate data set Z_n = {(x_i, y_i); i = 1, ..., n} its regression depth is at least n/3. This lower bound is attained for data lying on a convex or...
Persistent link: https://www.econbiz.de/10005093787
Support vector machines (SVMs) have attracted much attention in theoretical and in applied statistics. The main topics of recent interest are consistency, learning rates and robustness. We address the open problem whether SVMs are qualitatively robust. Our results show that SVMs are...
Persistent link: https://www.econbiz.de/10009023467
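Qualitative robustness asks, roughly, whether a small perturbation of the data-generating distribution can only change the fitted function a little. The snippet below is merely an empirical probe of that idea with scikit-learn's SVR under illustrative data, kernel, and contamination choices; it is not a statement of the paper's theoretical results.

```python
import numpy as np
from sklearn.svm import SVR

# Fit the same SVM on clean data and on a slightly contaminated copy, then
# compare the two fitted functions on a grid. A small supremum distance is
# the kind of stability that qualitative robustness formalizes.
rng = np.random.default_rng(3)
x = rng.uniform(-3, 3, 200).reshape(-1, 1)
y = np.sin(x).ravel() + 0.1 * rng.normal(size=200)

y_contam = y.copy()
y_contam[:4] += 25                      # contaminate 2% of the responses

grid = np.linspace(-3, 3, 300).reshape(-1, 1)
f_clean = SVR(kernel="rbf", C=1.0, epsilon=0.1).fit(x, y).predict(grid)
f_dirty = SVR(kernel="rbf", C=1.0, epsilon=0.1).fit(x, y_contam).predict(grid)
print("max |f_clean - f_contaminated| over the grid:", np.abs(f_clean - f_dirty).max())
```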
Kernel Based Regression (KBR) minimizes a convex risk over a possibly infinite dimensional reproducing kernel Hilbert space. Recently, it was shown that KBR with a least squares loss function may have some undesirable properties from a robustness point of view: even very small amounts of...
Persistent link: https://www.econbiz.de/10008521101
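The sensitivity alluded to above can be seen directly from least squares KBR (kernel ridge), whose solution is linear in the responses, so a single gross outlier shifts the fitted function in proportion to its size. Below is a from-scratch sketch under illustrative choices of kernel, regularization, and data; it is not the authors' implementation.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    """Gaussian RBF kernel matrix between the rows of A and B."""
    sq = np.sum(A**2, axis=1)[:, None] + np.sum(B**2, axis=1)[None, :] - 2 * A @ B.T
    return np.exp(-gamma * sq)

def kbr_least_squares(x, y, lam=0.1, gamma=1.0):
    """KBR with the least squares loss (kernel ridge): minimize
    (1/n) * sum_i (y_i - f(x_i))^2 + lam * ||f||^2 over the RKHS.
    The representer theorem gives f = sum_j alpha_j k(., x_j) with the
    closed-form coefficients below, which are linear in y."""
    n = len(y)
    K = rbf_kernel(x, x, gamma)
    alpha = np.linalg.solve(K + lam * n * np.eye(n), y)
    return lambda z: rbf_kernel(z, x, gamma) @ alpha

rng = np.random.default_rng(4)
x = rng.uniform(-2, 2, 100).reshape(-1, 1)
y = np.sin(2 * x).ravel() + 0.1 * rng.normal(size=100)
grid = np.linspace(-2, 2, 200).reshape(-1, 1)
f_clean = kbr_least_squares(x, y)(grid)

for shift in (5, 50, 500):
    y_bad = y.copy()
    y_bad[0] += shift                            # one gross outlier of growing size
    f_bad = kbr_least_squares(x, y_bad)(grid)
    # The change in the fit grows linearly with the outlier, since alpha is linear in y.
    print(shift, np.abs(f_bad - f_clean).max())
```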