Publication Type
Conference Proceeding Article
Version
publishedVersion
Publication Date
5-2016
Abstract
Conventional online kernel methods often yield an unboundedly large number of support vectors, making them inefficient and non-scalable for large-scale applications. Recent studies on bounded kernel-based online learning have attempted to overcome this shortcoming. Although they can bound the number of support vectors at each iteration, most of them fail to bound the number of support vectors for the final output solution, which is often obtained by averaging the series of solutions over all the iterations. In this paper, we propose a novel kernel-based online learning method, Sparse Passive Aggressive learning (SPA), which can output a final solution with a bounded number of support vectors. The key idea of our method is to explore an efficient stochastic sampling strategy, which turns an example into a new support vector with some probability that depends on the loss suffered by the example. We theoretically prove that the proposed SPA algorithm achieves an optimal regret bound in expectation, and empirically show that the new algorithm outperforms various bounded kernel-based online learning algorithms.
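To make the sampling idea concrete, below is a minimal Python sketch of a kernelized passive-aggressive update in which an example is admitted as a support vector with a probability that grows with its suffered loss. The RBF kernel, the admission probability loss / (loss + rho), and the 1/p rescaling of the step size are illustrative assumptions for this sketch, not the paper's exact SPA algorithm.

import numpy as np

def rbf_kernel(x1, x2, gamma=1.0):
    # Gaussian RBF kernel (illustrative choice; any Mercer kernel works).
    return np.exp(-gamma * np.sum((x1 - x2) ** 2))

class SparsePA:
    """Hypothetical sketch: kernel PA with loss-dependent stochastic
    admission of support vectors. Sampling rule is an assumed form."""

    def __init__(self, C=1.0, rho=1.0, gamma=1.0, seed=0):
        self.C, self.rho, self.gamma = C, rho, gamma
        self.sv_x, self.sv_alpha = [], []  # support vectors and coefficients
        self.rng = np.random.default_rng(seed)

    def predict_score(self, x):
        # Kernel expansion over the retained support vectors.
        return sum(a * rbf_kernel(sx, x, self.gamma)
                   for sx, a in zip(self.sv_x, self.sv_alpha))

    def partial_fit(self, x, y):
        loss = max(0.0, 1.0 - y * self.predict_score(x))  # hinge loss
        if loss == 0.0:
            return  # passive step: no update, no new support vector
        # Stochastic sampling: keep the example with a probability that
        # increases with the loss it suffered (assumed functional form).
        p = loss / (loss + self.rho)
        if self.rng.random() < p:
            # PA-I style step size, rescaled by 1/p so the update stays
            # unbiased in expectation despite the random sampling.
            tau = min(self.C, loss / rbf_kernel(x, x, self.gamma)) / p
            self.sv_x.append(x)
            self.sv_alpha.append(tau * y)

# Example usage: fewer support vectors are retained than examples seen.
model = SparsePA(C=1.0, rho=1.0)
X = np.random.default_rng(1).standard_normal((200, 2))
Y = np.sign(X[:, 0])
for x, y in zip(X, Y):
    model.partial_fit(x, y)
print(len(model.sv_x), "support vectors retained out of", len(X))

Because low-loss examples are admitted with small probability, the expansion stays sparse, which is the bounded-support-vector behavior the abstract describes.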
Discipline
Computer Sciences | Databases and Information Systems | Theory and Algorithms
Research Areas
Data Science and Engineering
Publication
2016 SIAM International Conference on Data Mining: Miami, Florida, May 5-7, 2016: Proceedings
First Page
675
Last Page
683
ISBN
9781611974348
Identifier
10.1137/1.9781611974348.76
Publisher
SIAM
City or Country
Philadelphia, PA
Citation
LU, Jing; ZHAO, Peilin; and HOI, Steven C. H.
Online sparse passive aggressive learning with kernels. (2016). 2016 SIAM International Conference on Data Mining: Miami, Florida, May 5-7, 2016: Proceedings. 675-683.
Available at: https://ink.library.smu.edu.sg/sis_research/3416
Creative Commons License
This work is licensed under a Creative Commons Attribution-NonCommercial-No Derivative Works 4.0 International License.
Additional URL
https://doi.org/10.1137/1.9781611974348.76