Title: Robustness of reweighted least squares kernel based regression
Authors: Debruyne, Michiel ×
Christmann, Andreas
Hubert, Mia
Suykens, Johan #
Issue Date: 2010
Publisher: Elsevier
Series Title: Journal of Multivariate Analysis vol:101 issue:2 pages:447-463
Abstract: Kernel Based Regression (KBR) minimizes a convex risk over a possibly infinite-dimensional reproducing kernel Hilbert space. Recently it was shown that KBR with a least squares loss function may have some undesirable properties from a robustness point of view: even very small amounts of outliers can dramatically affect the estimates. KBR with other loss functions is much more robust, but often gives rise to more complicated computations (e.g. for Huber or logistic losses). In classical statistics, robustness is often improved by reweighting the original estimate. In this paper we provide a theoretical framework for reweighted KBR and analyze its robustness. Some important differences are found with respect to linear regression, indicating that LS-KBR with a bounded kernel is much better suited for reweighting. Our results give practical guidelines for a good choice of weights, providing robustness as well as fast convergence. In particular, a logistic weight function seems an appropriate choice, not only to downweight outliers but also to improve performance at heavy-tailed distributions. For the latter, some heuristic arguments are given comparing concepts from robustness and stability.
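The reweighting scheme sketched in the abstract can be illustrated in code (a minimal sketch, not the authors' implementation): fit LS-KBR once, then iterate weighted kernel ridge steps in which each observation is downweighted according to its scaled residual via the logistic weight function w(r) = tanh(r)/r. The RBF kernel, the MAD-based residual scale, and all parameter values below are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(X, Z, sigma=1.0):
    # Gaussian RBF kernel matrix between row sets X and Z (illustrative choice).
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def reweighted_ls_kbr(X, y, lam=0.1, sigma=1.0, n_iter=10):
    """Iteratively reweighted LS-KBR sketch: returns kernel expansion
    coefficients alpha so that f(x) = sum_i alpha_i k(x, x_i)."""
    n = len(y)
    K = rbf_kernel(X, X, sigma)
    w = np.ones(n)  # first iteration = unweighted LS-KBR
    for _ in range(n_iter):
        W = np.diag(w)
        # Weighted kernel ridge step: solve (W K + n*lam*I) alpha = W y.
        alpha = np.linalg.solve(W @ K + n * lam * np.eye(n), W @ y)
        r = y - K @ alpha
        # Robust residual scale via the median absolute deviation (MAD).
        s = np.median(np.abs(r)) / 0.6745 + 1e-12
        u = r / s
        u_safe = np.where(np.abs(u) < 1e-12, 1e-12, u)
        # Logistic weight function w(r) = tanh(r)/r; near 1 for small
        # residuals, decays toward 0 for outliers.
        w = np.tanh(u_safe) / u_safe
    return alpha
```

Because each step only re-solves a weighted least squares system, the iteration keeps the computational simplicity of LS-KBR while the weights progressively suppress the influence of outlying observations.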
ISSN: 0047-259X
Publication status: published
KU Leuven publication type: IT
Appears in Collections:Leuven Statistics Research Centre (LStat)
Statistics Section
ESAT - STADIUS, Stadius Centre for Dynamical Systems, Signal Processing and Data Analytics
× corresponding author
# (joint) last author

Files in This Item:

There are no files associated with this item.

