In this paper, we present a new framework for large-scale online kernel learning that makes kernel methods efficient and scalable for large-scale online learning applications. Unlike conventional budget online kernel learning schemes, which bound the number of support vectors through budget maintenance strategies, our framework takes a different approach based on kernel functional approximation techniques, which make the subsequent online learning task efficient and scalable. Specifically, we present two online kernel learning algorithms: (i) the Fourier Online Gradient Descent (FOGD) algorithm, which applies random Fourier features to approximate kernel functions; and (ii) the Nyström Online Gradient Descent (NOGD) algorithm, which applies the Nyström method to approximate large kernel matrices. We explore these two approaches on three online learning tasks: binary classification, multi-class classification, and regression. Encouraging experimental results on large-scale datasets validate the effectiveness and efficiency of the proposed algorithms, making them potentially more practical than the family of existing budget online kernel learning approaches.
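As a rough illustration of the random-Fourier-feature idea behind FOGD, the sketch below approximates an RBF kernel with D random features and then runs online gradient descent with the hinge loss on the resulting linear model. All names, parameter values, and the choice of hinge loss here are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

class FOGDSketch:
    """Minimal sketch of Fourier Online Gradient Descent (FOGD).

    Approximates an RBF kernel k(x, y) = exp(-gamma * ||x - y||^2) with
    D random Fourier features, then learns a linear model on those
    features by online (sub)gradient descent with the hinge loss.
    Hyperparameter names and defaults are illustrative assumptions.
    """

    def __init__(self, dim, D=200, gamma=1.0, eta=0.1, seed=0):
        rng = np.random.default_rng(seed)
        # Sample omega ~ N(0, 2*gamma*I) so that the inner product of the
        # random features approximates the RBF kernel in expectation.
        self.omega = rng.normal(0.0, np.sqrt(2.0 * gamma), size=(dim, D))
        self.b = rng.uniform(0.0, 2.0 * np.pi, size=D)
        self.w = np.zeros(D)       # linear weights in feature space
        self.D, self.eta = D, eta

    def features(self, x):
        # z(x) = sqrt(2/D) * cos(omega^T x + b)
        return np.sqrt(2.0 / self.D) * np.cos(x @ self.omega + self.b)

    def predict(self, x):
        return np.sign(self.w @ self.features(x))

    def update(self, x, y):
        # Online hinge-loss subgradient step on one example (x, y), y in {-1, +1}.
        z = self.features(x)
        if y * (self.w @ z) < 1.0:
            self.w += self.eta * y * z
```

Because the kernel is replaced by an explicit finite-dimensional feature map, each online update costs O(D) regardless of the number of examples seen, which is the source of the scalability discussed in the abstract.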
online learning, kernel method, large scale machine learning
Computer Sciences | Databases and Information Systems | Theory and Algorithms
Data Science and Engineering
Journal of Machine Learning Research / Microtome Publishing
LU, Jing; HOI, Steven C. H.; WANG, Jialei; ZHAO, Peilin; and LIU, Zhi-Yong.
Large scale online kernel learning. (2016). Journal of Machine Learning Research. 17, (47), 1-43. Research Collection School Of Information Systems.
Available at: http://ink.library.smu.edu.sg/sis_research/3410
This work is licensed under a Creative Commons Attribution-Noncommercial-No Derivative Works 4.0 License.