Publication Type

Conference Proceeding Article

Publication Date

5-2016

Abstract

Conventional online kernel methods often yield an unboundedly large number of support vectors, making them inefficient and non-scalable for large-scale applications. Recent studies on bounded kernel-based online learning have attempted to overcome this shortcoming. Although they can bound the number of support vectors at each iteration, most of them fail to bound the number of support vectors for the final output solution, which is often obtained by averaging the series of solutions over all the iterations. In this paper, we propose a novel kernel-based online learning method, Sparse Passive Aggressive learning (SPA), which can output a final solution with a bounded number of support vectors. The key idea of our method is to explore an efficient stochastic sampling strategy, which turns an example into a new support vector with some probability that depends on the loss suffered by the example. We theoretically prove that the proposed SPA algorithm achieves an optimal regret bound in expectation, and empirically show that the new algorithm outperforms various bounded kernel-based online learning algorithms.
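The sampling idea described in the abstract, admitting an example as a support vector only with a probability that grows with its loss, can be illustrated with a short sketch. The following Python snippet is a minimal, hypothetical illustration only: the kernel choice, the probability formula min(1, rho * loss), and the PA-I-style step size are assumptions for exposition and are not the exact SPA update from the paper.

```python
import numpy as np

def rbf_kernel(x1, x2, gamma=1.0):
    # Assumed kernel choice for illustration.
    return np.exp(-gamma * np.linalg.norm(x1 - x2) ** 2)

class LossSampledKernelPA:
    """Kernel passive-aggressive-style online learner that adds a new
    support vector only with a probability depending on the hinge loss.
    Illustrative sketch, not the paper's exact SPA algorithm."""

    def __init__(self, C=1.0, gamma=1.0, rho=1.0):
        self.C = C            # aggressiveness / regularization trade-off
        self.gamma = gamma    # RBF kernel width (assumed)
        self.rho = rho        # scales loss into a sampling probability (assumed)
        self.sv_x = []        # stored support vectors
        self.sv_alpha = []    # their coefficients

    def decision(self, x):
        # Kernel expansion over the current support set.
        return sum(a * rbf_kernel(sx, x, self.gamma)
                   for a, sx in zip(self.sv_alpha, self.sv_x))

    def fit_one(self, x, y):
        f = self.decision(x)
        loss = max(0.0, 1.0 - y * f)           # hinge loss on this example
        if loss > 0.0:
            p = min(1.0, self.rho * loss)       # Bernoulli sampling probability
            if np.random.rand() < p:
                # PA-I-style step size; dividing by p keeps the update
                # unbiased in expectation (assumed correction).
                k_xx = rbf_kernel(x, x, self.gamma)
                tau = min(self.C, loss / max(k_xx, 1e-12)) / p
                self.sv_x.append(x)
                self.sv_alpha.append(tau * y)
        return 1.0 if f >= 0 else -1.0

# Example usage on synthetic data (hypothetical setup):
# learner = LossSampledKernelPA(C=1.0, gamma=0.5, rho=1.0)
# for x_t, y_t in stream:  # y_t in {-1, +1}
#     y_pred = learner.fit_one(x_t, y_t)
```

Because low-loss examples are skipped with high probability, the support set grows much more slowly than in a standard kernel PA update, which is the sparsity effect the abstract refers to.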

Discipline

Computer Sciences | Databases and Information Systems | Theory and Algorithms

Research Areas

Data Management and Analytics

Publication

2016 SIAM International Conference on Data Mining: May 5-7, 2016, Miami, Florida, USA

First Page

675

Last Page

683

Identifier

10.1137/1.9781611974348.76

Publisher

SIAM

City or Country

Philadelphia, PA

Creative Commons License

Creative Commons Attribution-Noncommercial-No Derivative Works 4.0 License
This work is licensed under a Creative Commons Attribution-Noncommercial-No Derivative Works 4.0 License.

Additional URL

http://doi.org/10.1137/1.9781611974348.76
