Publication Type

Conference Proceeding Article

Version

Published version

Publication Date

8-2013

Abstract

In this work, we present a new framework for large-scale online kernel classification, making kernel methods efficient and scalable for large-scale online learning tasks. Unlike conventional budget online kernel learning schemes, which bound the number of support vectors through various budget-maintenance strategies, our framework explores a functional approximation approach: it approximates the kernel function or kernel matrix so that the subsequent online learning task becomes efficient and scalable. Specifically, we present two online kernel machine learning algorithms: (i) the Fourier Online Gradient Descent (FOGD) algorithm, which applies random Fourier features to approximate kernel functions; and (ii) the Nyström Online Gradient Descent (NOGD) algorithm, which applies the Nyström method to approximate large kernel matrices. We offer theoretical analysis of the proposed algorithms and conduct experiments on large-scale online classification tasks, including a dataset of over 1 million instances. Our encouraging results validate the effectiveness and efficiency of the proposed algorithms, making them potentially more practical than the family of existing budget kernel online learning approaches.
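The abstract describes both algorithms at a high level: each replaces the kernel with an approximate finite-dimensional feature map and then runs linear online gradient descent. The sketch below is a minimal, illustrative NumPy rendering of those two ideas, not the authors' exact formulation; all function names, the choice of RBF kernel, the hinge loss, and the fixed learning rate eta are assumptions made here for illustration.

```python
import numpy as np

def rff_map(U, b, D):
    # Random Fourier feature map z(x) with z(x).z(y) ~= exp(-gamma*||x-y||^2)
    # (illustrative FOGD-style feature map; U, b drawn once up front).
    return lambda x: np.sqrt(2.0 / D) * np.cos(U @ x + b)

def nystrom_map(B, gamma, k):
    # Rank-k Nystrom feature map from a small landmark set B (m x d),
    # illustrative NOGD-style construction: z(x) = Lam_k^{-1/2} V_k^T k_B(x).
    K = np.exp(-gamma * np.sum((B[:, None] - B[None]) ** 2, axis=2))
    vals, vecs = np.linalg.eigh(K)             # ascending eigenvalues
    vals, vecs = vals[-k:], vecs[:, -k:]       # keep the top-k eigenpairs
    M = vecs / np.sqrt(np.maximum(vals, 1e-12))
    return lambda x: M.T @ np.exp(-gamma * np.sum((B - x) ** 2, axis=1))

def online_gradient_descent(stream, feature_map, dim, eta=0.01):
    # Linear OGD with hinge loss in the approximate feature space;
    # `stream` yields (x, y) pairs with y in {-1, +1}.
    w, mistakes = np.zeros(dim), 0
    for x, y in stream:
        z = feature_map(x)
        if y * (w @ z) <= 0:
            mistakes += 1
        if y * (w @ z) < 1:                    # hinge-loss subgradient step
            w += eta * y * z
    return w, mistakes
```

In this sketch, the FOGD-style frequencies would be drawn once up front, e.g. U = rng.normal(scale=np.sqrt(2 * gamma), size=(D, d)) and b = rng.uniform(0, 2 * np.pi, size=D), so that the feature inner products approximate the Gaussian kernel; for the NOGD-style map, the landmark set B could simply be the first few examples from the stream. Either way, the per-step cost depends only on the fixed feature dimension rather than on a growing support-vector set, which is the scalability argument the abstract makes.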

Discipline

Computer Sciences | Databases and Information Systems | Numerical Analysis and Scientific Computing

Research Areas

Data Science and Engineering

Publication

Proceedings of the Twenty-Third International Joint Conference on Artificial Intelligence: August 3-9, 2013, Beijing

First Page

1750

Last Page

1756

ISBN

9781577356332

Publisher

AAAI Press

City or Country

Menlo Park, CA

Additional URL

https://www.ijcai.org/Proceedings/13/Papers/259.pdf
