Publication Type

Journal Article

Version

acceptedVersion

Publication Date

12-2008

Abstract

Robust regression techniques are critical for fitting noisy data in real-world applications. Most previous work on robust kernel regression formulates the problem in a dual form, which is then solved by a quadratic programming solver. In this correspondence, we propose a new formulation for robust regularized kernel regression under the theoretical framework of regularization networks and tackle the optimization problem directly in the primal. We show that the primal and dual approaches are equivalent in that they achieve similar regression performance, but the primal formulation is more efficient and easier to implement than the dual one. Unlike previous work, our approach also optimizes the bias term. In addition, we show that the proposed solution can be easily extended to other noise-resilient loss functions, including the Huber-ε insensitive loss function. Finally, we conduct a set of experiments on both artificial and real data sets, whose promising results show that the proposed method is effective and more efficient than traditional approaches.
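
To illustrate the general idea of solving a robust kernel regression in the primal, the sketch below fits the model f(x) = Σᵢ αᵢ k(xᵢ, x) + b (bias included) under a Huber loss plus a kernel-norm regularizer, using iteratively reweighted least squares. This is a minimal illustration of primal optimization with a robust loss, not the paper's exact solver; the RBF kernel and the parameters lam, gamma, and delta are assumptions chosen for the example.

```python
import numpy as np

def rbf_kernel(X, Z, gamma=1.0):
    # Gaussian (RBF) kernel matrix between row-sets X and Z (an assumed choice)
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def robust_kernel_regression(X, y, lam=1e-2, gamma=1.0, delta=1.0,
                             n_iter=50, tol=1e-6):
    """Primal IRLS sketch: minimize sum_i huber(y_i - f(x_i))
    + (lam/2) * alpha^T K alpha, with f(x) = K(x, X) @ alpha + b."""
    n = len(y)
    K = rbf_kernel(X, X, gamma)
    alpha, b = np.zeros(n), 0.0
    for _ in range(n_iter):
        r = y - K @ alpha - b
        # Huber IRLS weights: 1 in the quadratic zone, delta/|r| in the linear zone
        w = np.where(np.abs(r) <= delta,
                     1.0, delta / np.maximum(np.abs(r), 1e-12))
        # Stationarity of the weighted, regularized primal objective:
        #   (W K + lam I) alpha + W 1 b = W y
        #   w^T K alpha     + (sum w) b = w^T y
        A = np.zeros((n + 1, n + 1))
        A[:n, :n] = w[:, None] * K + lam * np.eye(n)
        A[:n, n] = w
        A[n, :n] = w @ K
        A[n, n] = w.sum()
        rhs = np.concatenate([w * y, [w @ y]])
        sol = np.linalg.solve(A, rhs)
        new_alpha, new_b = sol[:n], sol[n]
        if np.linalg.norm(new_alpha - alpha) < tol and abs(new_b - b) < tol:
            alpha, b = new_alpha, new_b
            break
        alpha, b = new_alpha, new_b
    return alpha, b

# Usage: alpha, b = robust_kernel_regression(X_train, y_train)
# Predictions: rbf_kernel(X_test, X_train, gamma) @ alpha + b
```

Because the large-residual weights shrink as delta/|r|, outliers influence each reweighted least-squares solve only linearly rather than quadratically, which is the source of the robustness; the bias b is solved jointly with alpha rather than fixed in advance.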

Keywords

Kernel regression, regularized least squares (RLS), robust estimator, support vector machine (SVM)

Discipline

Databases and Information Systems

Research Areas

Data Science and Engineering

Publication

IEEE Transactions on Systems, Man, and Cybernetics, Part B (Cybernetics)

Volume

38

Issue

6

First Page

1639

Last Page

1644

ISSN

1083-4419

Identifier

10.1109/TSMCB.2008.927279

Publisher

IEEE

Additional URL

https://doi.org/10.1109/TSMCB.2008.927279
