A note on min-max bias estimators in approximately linear models
The approximately linear model represents deviations from the ideal linear model by a vector contained in a prescribed bias-ball. In a recent paper, Mathew and Nordstrom (1993) proposed min-max bias estimators, in which a criterion function is defined by maximizing errors over the bias-ball. When the Chebyshev norm defines the bias-ball, they found the least absolute deviation (L1) estimator to be identical to its max-bias version. This was thought to be a robustness property, since it contrasts with least squares, where the max-bias criterion is a combination of L1 and the sum of squares. In this paper it is shown, however, that the equivalence between the L1 estimator and its min-max bias version is not special to L1 and that the equivalence also holds for estimators that are not robust. Hence, while the L1 estimator does have desirable robustness properties, its equivalence to its min-max bias version cannot be counted as one of them.
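A minimal sketch of why the equivalence holds, using notation assumed here rather than taken from the paper: write the residuals as r_i(\beta) = y_i - x_i'\beta and let the Chebyshev bias-ball have radius c, so |b_i| \le c for every i. Maximizing the L1 criterion over the bias-ball only adds a constant,

\[ \max_{\|b\|_\infty \le c} \sum_i \bigl| r_i(\beta) - b_i \bigr| \;=\; \sum_i \bigl( |r_i(\beta)| + c \bigr) \;=\; \sum_i |r_i(\beta)| + nc, \]

so the minimizing \beta is the ordinary L1 estimator. For least squares the same maximization gives

\[ \max_{\|b\|_\infty \le c} \sum_i \bigl( r_i(\beta) - b_i \bigr)^2 \;=\; \sum_i \bigl( |r_i(\beta)| + c \bigr)^2 \;=\; \sum_i r_i(\beta)^2 + 2c \sum_i |r_i(\beta)| + nc^2, \]

whose minimizer depends on c and mixes the sum of squares with an L1 term, matching the combination described in the abstract.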
Year of publication: 1994
Authors: Bassett, Gilbert W.
Published in: Statistics & Probability Letters. - Elsevier, ISSN 0167-7152. - Vol. 21 (1994), Issue 1, pp. 27-28
Publisher: Elsevier
Keywords: Approximately linear model; Least squares; Least absolute deviation
Similar items by person
- Pessimistic portfolio allocation and Choquet expected utility / Bassett, Gilbert W. (2004)
- Conceptualizing inequality and risk / Persky, Joseph (2006)
- March madness, quantile regression bracketology, and the Hayek hypothesis / Koenker, Roger (2010)
- More ...