The present method allows the detection of outlying observations in data that can be described by a deterministic function plus a stochastic component. This type of functional relationship often occurs in experimental data, for instance in toxicological research. The Hampel identifier, an outlier...
Persistent link: https://www.econbiz.de/10010955362
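The Hampel identifier mentioned in the abstract above can be sketched as a median/MAD decision rule. The sample data and the threshold k = 3.0 below are illustrative choices, not values taken from the paper:

```python
import numpy as np

def hampel_identifier(x, k=3.0):
    """Flag observations farther than k robust scales from the median.

    Sketch of the classical Hampel identifier: x_i is declared an
    outlier if |x_i - med(x)| > k * MAD(x), where MAD is the
    normalized median absolute deviation. k = 3.0 is a common
    illustrative choice.
    """
    x = np.asarray(x, dtype=float)
    med = np.median(x)
    # The factor 1.4826 makes the MAD consistent for the standard
    # deviation under a normal model.
    mad = 1.4826 * np.median(np.abs(x - med))
    return np.abs(x - med) > k * mad

data = np.array([9.8, 10.1, 10.0, 9.9, 10.2, 25.0])
flags = hampel_identifier(data)  # flags only the planted outlier 25.0
```

Because both the median and the MAD have a 50% breakdown point, the rule still works when a sizeable fraction of the sample is contaminated.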
The aim of detecting outliers in a multivariate sample can be pursued in different ways. We investigate here the performance of several simultaneous multivariate outlier identification rules based on robust estimators of location and scale. It has been shown that the use of estimators with high...
Persistent link: https://www.econbiz.de/10010955369
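A simultaneous multivariate identification rule of the kind studied above flags every observation whose robust Mahalanobis distance exceeds a fixed cutoff. The coordinatewise median with a diagonal MAD-based scatter below is a crude hypothetical stand-in for the high-breakdown estimators compared in the paper, and the cutoff 7.378 (the 0.975 quantile of chi-squared with 2 degrees of freedom, matching the 2-column example) is likewise illustrative:

```python
import numpy as np

def simultaneous_outlier_rule(X, cutoff=7.378):
    """Flag rows of X whose robust Mahalanobis distance exceeds cutoff.

    Robust location: coordinatewise median. Robust scatter: diagonal
    matrix of squared normalized MADs (a simple stand-in for the
    high-breakdown location/scatter estimators in the literature).
    """
    X = np.asarray(X, dtype=float)
    loc = np.median(X, axis=0)
    mad = 1.4826 * np.median(np.abs(X - loc), axis=0)
    scatter_inv = np.diag(mad ** -2.0)
    centered = X - loc
    # Squared Mahalanobis distance of every row at once.
    d2 = np.einsum('ij,jk,ik->i', centered, scatter_inv, centered)
    return d2 > cutoff

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 2))
X[0] = [8.0, 8.0]  # planted gross outlier
flags = simultaneous_outlier_rule(X)
```

Because all n distances are compared with one common cutoff, the rule is "simultaneous" in the sense of the abstract: it tests the whole sample at once rather than one suspect point at a time.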
In their paper, Davies and Gather (1993) formalized the task of outlier identification, also considering certain performance criteria for outlier identifiers. One of these criteria, the maximum asymptotic bias, is carried over here to multivariate outlier identifiers. We show how this term...
Persistent link: https://www.econbiz.de/10010955406
In investigations of the behaviour of robust estimators, their consistency and asymptotic normality are typically studied as a matter of course. Their rates of convergence, however, are often given less weight. We show here that the rate of convergence of a multivariate robust estimator to its...
Persistent link: https://www.econbiz.de/10010955430
In this paper, we consider one-step outlier identification rules for multivariate data, generalizing the concept of so-called α-outlier identifiers, as presented in Davies and Gather (1993) for the case of univariate samples. We investigate how the finite sample breakdown points of estimators...
Persistent link: https://www.econbiz.de/10010955452
Methods of dimension reduction are very helpful, and almost a necessity, if we want to analyse high-dimensional time series, since modelling otherwise requires many parameters because of interactions at various time lags. We use a dynamic version of Sliced Inverse Regression (SIR; Li (1991)), which...
Persistent link: https://www.econbiz.de/10010955462
Persistent link: https://www.econbiz.de/10010955467
Persistent link: https://www.econbiz.de/10010955486
Sliced Inverse Regression (SIR) is a promising technique for the purpose of dimension reduction. Several properties of this relatively new method have already been examined, but little attention has been paid to robustness aspects. We show that SIR is very sensitive to outliers in the data....
Persistent link: https://www.econbiz.de/10010955507
Sliced inverse regression (SIR) is a clever technique for reducing the dimension of the predictor in regression problems, thus avoiding the curse of dimensionality. There exist many contributions on various aspects of the performance of SIR. Up to now, little attention has been paid to the problem...
Persistent link: https://www.econbiz.de/10009216897
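The basic SIR algorithm of Li (1991), on which the entries above build, can be sketched in a few lines: standardize the predictors, slice the sample by the order of the response, average the standardized predictors within each slice, and take the leading eigenvectors of the between-slice covariance of those means. The slice count and the simulated single-index model below are illustrative assumptions:

```python
import numpy as np

def sir_directions(X, y, n_slices=5, n_dirs=1):
    """Sketch of Sliced Inverse Regression (Li, 1991).

    Returns estimated effective dimension reduction directions,
    normalized to unit length.
    """
    X = np.asarray(X, dtype=float)
    n, p = X.shape
    Xc = X - X.mean(axis=0)
    # Symmetric inverse square root of the covariance, for standardization.
    w, V = np.linalg.eigh(np.cov(Xc, rowvar=False))
    inv_sqrt = V @ np.diag(w ** -0.5) @ V.T
    Z = Xc @ inv_sqrt
    # Slice the sample by the order of y and average Z within slices.
    slices = np.array_split(np.argsort(y), n_slices)
    M = np.zeros((p, p))
    for idx in slices:
        m = Z[idx].mean(axis=0)
        M += (len(idx) / n) * np.outer(m, m)
    # Leading eigenvectors of the between-slice covariance,
    # mapped back to the original predictor scale.
    vals, vecs = np.linalg.eigh(M)
    B = inv_sqrt @ vecs[:, ::-1][:, :n_dirs]
    return B / np.linalg.norm(B, axis=0)

rng = np.random.default_rng(1)
X = rng.normal(size=(400, 3))
y = X[:, 0] + 0.1 * rng.normal(size=400)  # true direction is e_1
b = sir_directions(X, y).ravel()
```

Because each step uses sample means and the sample covariance, a single gross outlier can distort the estimated directions badly, which is exactly the sensitivity examined in the robustness entry above.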