Publication Type
Conference Proceeding Article
Version
acceptedVersion
Publication Date
11-2017
Abstract
Learning to optimize AUC performance for classifying label-imbalanced data in online scenarios has been extensively studied in recent years. Most of the existing work has attempted to address the problem directly in the original feature space, which may not be suitable for non-linearly separable datasets. To address this issue, kernel-based learning methods have been proposed for non-linearly separable datasets. However, such kernel approaches are inefficient and fail to scale well on large-scale datasets in practice. Taking this cue, in this work we explore the use of two scalable kernel-based learning techniques, random Fourier features and the Nyström method, as surrogates to existing approaches for tackling the problem, and provide insights into the differences between the two methods based on their online performance. In contrast to conventional kernel-based learning methods, which suffer from the high computational cost of the kernel matrix, our proposed approaches alleviate this issue with linear features that approximate the kernel function/matrix. Specifically, two surrogate kernel-based learning models are presented for the online AUC maximization task: (i) the Fourier Online AUC Maximization (FOAM) algorithm, which samples the basis functions from a data-independent distribution to approximate the kernel function; and (ii) the Nyström Online AUC Maximization (NOAM) algorithm, which samples a subset of instances from the training data to approximate the kernel matrix by a low-rank matrix. Another novelty of the present work is the proposed mini-batch Online Gradient Descent method for model updating, which controls the noise and reduces the variance of the gradients. We provide theoretical analyses for the two proposed algorithms. Empirical studies on commonly used large-scale datasets show that the proposed algorithms outperform existing state-of-the-art methods in terms of both AUC performance and computational efficiency.
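As a rough illustration of the two approximation schemes the abstract describes (a minimal sketch, not the authors' released FOAM/NOAM code), the NumPy snippet below builds the linear feature maps that such learners would train on. The RBF kernel choice, the bandwidth gamma, the feature dimension D, the landmark set, and the helper names rff_features/nystrom_features are all illustrative assumptions.

    import numpy as np

    def rff_features(X, D=100, gamma=1.0, seed=0):
        # Illustrative random Fourier features (Rahimi & Recht) for the RBF
        # kernel k(x, y) = exp(-gamma * ||x - y||^2): sample frequencies W
        # from the kernel's Fourier transform, a data-independent Gaussian.
        rng = np.random.default_rng(seed)
        W = rng.normal(scale=np.sqrt(2.0 * gamma), size=(X.shape[1], D))
        b = rng.uniform(0.0, 2.0 * np.pi, size=D)
        return np.sqrt(2.0 / D) * np.cos(X @ W + b)

    def nystrom_features(X, landmarks, gamma=1.0):
        # Illustrative Nystrom feature map: approximate the full kernel
        # matrix by a low-rank factorization built from a sampled subset
        # ("landmarks") of the training instances.
        def rbf(A, B):
            sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
            return np.exp(-gamma * sq)
        K_mm = rbf(landmarks, landmarks)
        vals, vecs = np.linalg.eigh(K_mm)      # symmetric eigendecomposition
        vals = np.maximum(vals, 1e-12)         # guard tiny/negative eigenvalues
        K_inv_sqrt = vecs @ np.diag(vals ** -0.5) @ vecs.T  # K_mm^{-1/2}
        return rbf(X, landmarks) @ K_inv_sqrt  # phi(X) phi(X)^T approximates K

A linear model trained on either feature map approximates the corresponding kernel machine: inner products of the random Fourier features converge to the kernel value as D grows, while the Nyström features reproduce the low-rank factorization K_nm K_mm^{-1} K_mn of the kernel matrix, matching the data-independent versus data-dependent sampling distinction drawn in the abstract.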
Keywords
Kernel, Computational modeling, Approximation algorithms, Machine learning algorithms, Support vector machines, Learning systems
Discipline
Computer Sciences | Databases and Information Systems | Theory and Algorithms
Research Areas
Data Science and Engineering
Publication
2017 IEEE International Conference on Data Mining (ICDM): New Orleans, LA, November 18-21: Proceedings
First Page
91
Last Page
100
ISBN
9781538638354
Identifier
10.1109/ICDM.2017.18
Publisher
IEEE
City or Country
Piscataway, NJ
Citation
DING, Yi; LIU, Chenghao; ZHAO, Peilin; and HOI, Steven C. H.
Large scale kernel methods for online AUC maximization. (2017). 2017 IEEE International Conference on Data Mining (ICDM): New Orleans, LA, November 18-21: Proceedings. 91-100.
Available at: https://ink.library.smu.edu.sg/sis_research/3969
Copyright Owner and License
Authors/LARC
Creative Commons License
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
Additional URL
https://doi.org/10.1109/ICDM.2017.18