Publication Type

Journal Article

Version

publishedVersion

Publication Date

April 2016

Abstract

In this paper, we present a new framework for large-scale online kernel learning, making kernel methods efficient and scalable for large-scale online learning applications. Unlike the conventional budget online kernel learning scheme, which bounds the number of support vectors through budget maintenance strategies, our framework takes a completely different approach, applying kernel functional approximation techniques to make the subsequent online learning task efficient and scalable. Specifically, we present two different online kernel machine learning algorithms: (i) the Fourier Online Gradient Descent (FOGD) algorithm, which applies random Fourier features to approximate kernel functions; and (ii) the Nyström Online Gradient Descent (NOGD) algorithm, which applies the Nyström method to approximate large kernel matrices. We explore these two approaches to tackle three online learning tasks: binary classification, multi-class classification, and regression. The encouraging results of our experiments on large-scale datasets validate the effectiveness and efficiency of the proposed algorithms, making them potentially more practical than the family of existing budget online kernel learning approaches.
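
To make the two named techniques concrete, here is a minimal, self-contained Python sketch, assuming an RBF kernel and the hinge loss; it illustrates the general recipe rather than the authors' implementation, and all parameter names (D, gamma, eta, rank) and the helper nystrom_features are hypothetical. FOGD replaces the kernel with D random Fourier features and runs plain online gradient descent in that feature space; the NOGD variant swaps in a Nyström feature map built from a small set of landmark points.

```python
import numpy as np

class FOGD:
    """Minimal sketch of Fourier Online Gradient Descent (binary case).

    Approximates the RBF kernel k(x, y) = exp(-gamma * ||x - y||^2) with
    D random Fourier features, then learns a linear model online with
    gradient descent on the hinge loss. Names and defaults here are
    illustrative assumptions, not the paper's settings.
    """

    def __init__(self, dim, D=400, gamma=1.0, eta=0.1, seed=0):
        rng = np.random.default_rng(seed)
        # For an RBF kernel, frequencies are sampled from N(0, 2*gamma*I).
        self.W = rng.normal(0.0, np.sqrt(2.0 * gamma), size=(D, dim))
        self.b = rng.uniform(0.0, 2.0 * np.pi, size=D)
        self.w = np.zeros(D)  # linear weights in the random feature space
        self.D, self.eta = D, eta

    def features(self, x):
        # z(x) = sqrt(2/D) * cos(Wx + b), so z(x) . z(y) ~= k(x, y).
        return np.sqrt(2.0 / self.D) * np.cos(self.W @ x + self.b)

    def predict(self, x):
        return 1.0 if self.w @ self.features(x) >= 0.0 else -1.0

    def fit_one(self, x, y):
        # One online gradient descent step on the hinge loss, y in {-1, +1}.
        z = self.features(x)
        if y * (self.w @ z) < 1.0:  # margin violated: take a gradient step
            self.w += self.eta * y * z


def nystrom_features(landmarks, gamma=1.0, rank=50):
    """Sketch of a Nystrom feature map in the spirit of NOGD.

    Given a small sample of landmark points, builds z(x) such that
    z(x) . z(y) approximates the RBF kernel; the online learner then runs
    the same linear gradient descent update in this feature space.
    """
    def k(A, x):  # RBF kernel between the rows of A and a single point x
        return np.exp(-gamma * ((A - x) ** 2).sum(axis=1))

    K = np.stack([k(landmarks, x) for x in landmarks])  # small kernel matrix
    vals, vecs = np.linalg.eigh(K)
    vals, vecs = vals[-rank:], vecs[:, -rank:]          # top-rank eigenpairs
    proj = vecs / np.sqrt(np.maximum(vals, 1e-12))      # ~ V * Lambda^(-1/2)
    return lambda x: proj.T @ k(landmarks, x)
```

Either way, the kernel machine collapses to a fixed-dimensional linear model, so each online update costs O(D) (or O(rank)) time regardless of how many examples have been seen, rather than growing with a support vector set; this is the source of the efficiency and scalability the abstract claims.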

Keywords

online learning, kernel method, large-scale machine learning

Discipline

Computer Sciences | Databases and Information Systems | Theory and Algorithms

Research Areas

Data Science and Engineering

Publication

Journal of Machine Learning Research

Volume

17

Issue

47

First Page

1

Last Page

43

ISSN

1532-4435

Publisher

Journal of Machine Learning Research / Microtome Publishing

Copyright Owner and License

Authors

Additional URL

http://jmlr.org/papers/volume17/14-148/14-148.pdf
