Publication Type

Conference Proceeding Article

Version

Accepted version

Publication Date

7-2012

Abstract

Kernel-based online learning often achieves state-of-the-art performance on many online learning tasks. It suffers, however, from a major shortcoming: the number of support vectors is unbounded, which makes it non-scalable and unsuitable for applications with large-scale datasets. In this work, we study the problem of bounded kernel-based online learning, which aims to constrain the number of support vectors to a predefined budget. Although several algorithms have been proposed in the literature, they are neither computationally efficient, due to their intensive budget maintenance strategies, nor effective, due to their reliance on the simple Perceptron algorithm. To overcome these limitations, we propose a framework for bounded kernel-based online learning based on an online gradient descent approach. We propose two efficient bounded online gradient descent (BOGD) algorithms for scalable kernel-based online learning: (i) BOGD, which maintains support vectors by uniform sampling, and (ii) BOGD++, which maintains support vectors by non-uniform sampling. We present a theoretical regret-bound analysis for both algorithms, and we observe promising empirical performance, in terms of both efficacy and efficiency, when comparing them to several well-known algorithms for bounded kernel-based online learning on large-scale datasets.
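
The following is a minimal, illustrative sketch of the idea the abstract describes: kernel-based online gradient descent whose support-vector set is capped by a budget, with budget maintenance by uniform sampling in the spirit of BOGD. The class name BudgetedKernelOGD, the RBF kernel, the hinge loss, and all hyperparameters (eta, lam, gamma, budget) are assumptions for illustration only; the paper's exact update rule and rescaling constants may differ.

```python
# Minimal sketch (not the paper's exact algorithm) of budgeted kernel
# online gradient descent with uniform-sampling budget maintenance.
import math
import random

def rbf(x, z, gamma=1.0):
    """RBF kernel k(x, z) = exp(-gamma * ||x - z||^2) (assumed kernel)."""
    return math.exp(-gamma * sum((a - b) ** 2 for a, b in zip(x, z)))

class BudgetedKernelOGD:
    def __init__(self, budget=100, eta=0.1, lam=0.01, gamma=1.0):
        self.budget = budget      # max number of support vectors B
        self.eta = eta            # gradient step size (illustrative)
        self.lam = lam            # L2 regularization strength (illustrative)
        self.gamma = gamma
        self.sv = []              # list of (alpha, x) support vectors

    def predict(self, x):
        """f(x) = sum_i alpha_i * k(x_i, x) over the current SV set."""
        return sum(a * rbf(z, x, self.gamma) for a, z in self.sv)

    def update(self, x, y):
        """One online round: predict, then a gradient step on the hinge loss."""
        margin = y * self.predict(x)
        # Regularization shrinks all existing coefficients.
        self.sv = [((1 - self.eta * self.lam) * a, z) for a, z in self.sv]
        if margin < 1:  # non-zero hinge loss -> add a new support vector
            self.sv.append((self.eta * y, x))
            if len(self.sv) > self.budget:
                # Budget maintenance by uniform sampling: evict one SV at
                # random, then rescale the survivors so the predictor is
                # unchanged in expectation.
                n = len(self.sv)  # budget + 1 at this point
                self.sv.pop(random.randrange(n))
                scale = n / (n - 1)
                self.sv = [(scale * a, z) for a, z in self.sv]
```

Per the abstract, BOGD++ differs in the budget maintenance step: the support vector to evict is chosen by non-uniform sampling rather than uniformly. A natural (assumed) instantiation would bias eviction toward support vectors with small coefficient magnitude |alpha|, with a matching rescaling to keep the update unbiased.
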

Keywords

Computationally efficient, Empirical performance, Gradient descent, Gradient descent algorithms, Large-scale datasets

Discipline

Computer Sciences | Databases and Information Systems | Theory and Algorithms

Research Areas

Data Science and Engineering

Publication

Proceedings of the Twenty-Ninth International Conference on Machine Learning: June 26 - July 1, 2012, Edinburgh, Scotland

First Page

169

Last Page

176

ISBN

9781450312851

Publisher

International Machine Learning Society

City or Country

Madison, WI

Copyright Owner and License

Authors

Additional URL

https://icml.cc/2012/papers/108.pdf
