Consistency and robustness of kernel based regression
We investigate properties of kernel-based regression (KBR) methods that are inspired by the convex risk minimization approach of support vector machines. We first describe the relation between the loss function used in the KBR method and the tail behavior of the response variable Y. We then establish a consistency result for KBR and give assumptions for the existence of the influence function. In particular, our results allow one to choose the loss function and the kernel so as to obtain computationally tractable and consistent KBR methods with bounded influence functions. Furthermore, bounds for the sensitivity curve, a finite-sample version of the influence function, are developed, and some numerical experiments are discussed.
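The abstract's point about choosing the loss function to obtain a tractable, robust KBR method can be illustrated with a small hypothetical sketch (not the authors' implementation): regularized kernel regression with a Gaussian kernel and the Huber loss, whose derivative is bounded, so a single gross outlier exerts only limited pull on the fit. All function names and parameter values below are illustrative assumptions.

```python
import numpy as np

def gaussian_kernel(A, B, gamma=1.0):
    # Gram matrix of the Gaussian RBF kernel k(x, z) = exp(-gamma * ||x - z||^2).
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def huber_grad(r, delta=0.5):
    # Derivative of the Huber loss in the residual r; it is bounded by delta.
    # A Lipschitz loss derivative is the ingredient behind bounded influence.
    return np.clip(r, -delta, delta)

def fit_kbr(X, y, lam=0.01, gamma=2.0, delta=0.5, lr=0.1, steps=3000):
    # Gradient descent on alpha for the regularized empirical risk
    #   (1/n) * sum_i L_Huber((K alpha)_i - y_i) + lam * alpha^T K alpha,
    # where f = sum_j alpha_j k(., x_j) by the representer theorem.
    K = gaussian_kernel(X, X, gamma)
    n = len(y)
    alpha = np.zeros(n)
    for _ in range(steps):
        r = K @ alpha - y
        grad = K @ huber_grad(r, delta) / n + 2.0 * lam * (K @ alpha)
        alpha -= lr * grad
    return alpha

# Toy data: a smooth signal with one gross outlier at the first point.
X = np.linspace(0.0, 3.0, 25)[:, None]
y = np.sin(X[:, 0])
y[0] += 10.0  # the bounded loss derivative caps this point's pull on the fit
alpha = fit_kbr(X, y)
pred = gaussian_kernel(X, X, gamma=2.0) @ alpha
```

Replacing `huber_grad` with the raw residual `r` recovers least squares, whose unbounded loss derivative lets the outlier drag the whole fit; that contrast is the bounded- versus unbounded-influence distinction the abstract refers to.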
Year of publication: 2005
Authors: Christmann, Andreas; Steinwart, Ingo
Institutions: Institut für Wirtschafts- und Sozialstatistik, Universität Dortmund
Similar items by person:
- Robust Learning from Bites for Data Mining. Christmann, Andreas (2006)
- On robustness properties of convex risk minimization methods for pattern recognition. Christmann, Andreas (2003)
- Measuring overlap in logistic regression. Christmann, Andreas (1999)