Publication Type

Conference Proceeding Article

Version

Accepted version

Publication Date

1-2015

Abstract

Learning to maximize AUC performance is an important research problem in machine learning. Unlike traditional batch learning methods for AUC maximization, which often suffer from poor scalability, recent years have witnessed emerging studies that attempt to maximize AUC with single-pass online learning approaches. Despite the encouraging results reported, existing online AUC maximization algorithms often adopt simple stochastic gradient descent, which fails to exploit the geometry of the data observed during the online learning process and can thus suffer from relatively slow convergence. To overcome this limitation of existing studies, in this paper we propose a novel algorithm, Adaptive Online AUC Maximization (AdaOAM), which applies an adaptive gradient method that exploits the knowledge of historical gradients to perform more informative online learning. The new adaptive updating strategy of AdaOAM is less sensitive to parameter settings because of its natural effect of tuning the learning rate. In addition, the time complexity of the new algorithm remains the same as that of previous non-adaptive algorithms. To demonstrate the effectiveness of the proposed algorithm, we analyze its theoretical bound and further evaluate its empirical performance on both public benchmark datasets and anomaly detection datasets. The encouraging empirical results clearly show the effectiveness and efficiency of the proposed algorithm.
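To make the idea concrete, the following is a minimal sketch (not the paper's exact AdaOAM algorithm) of the AdaGrad-style update the abstract describes: per-coordinate learning rates scaled by accumulated historical squared gradients, applied here to a pairwise hinge surrogate of the AUC loss. The function name, the choice of hinge surrogate, and the hyperparameter values are illustrative assumptions.

```python
import numpy as np

def adagrad_pairwise_auc_step(w, G, x_pos, x_neg, eta=0.5, eps=1e-8):
    """One adaptive-gradient update on the pairwise hinge surrogate of AUC,
    loss = max(0, 1 - w.(x_pos - x_neg)), for one positive/negative pair.

    Illustrative sketch only: AdaOAM itself is built on a different pairwise
    loss formulation; this shows the adaptive per-coordinate step size
    eta / sqrt(G + eps) that exploits accumulated squared gradients G."""
    diff = x_pos - x_neg
    if 1.0 - w @ diff > 0.0:            # margin violated: nonzero subgradient
        g = -diff                        # subgradient of the hinge loss w.r.t. w
        G += g * g                       # accumulate squared historical gradients
        w -= (eta / np.sqrt(G + eps)) * g  # coordinate-wise adaptive step
    return w, G

# Usage: stream random positive/negative pairs and update online.
rng = np.random.default_rng(0)
w, G = np.zeros(5), np.zeros(5)
for _ in range(200):
    x_pos = rng.normal(1.0, 1.0, size=5)   # synthetic positive example
    x_neg = rng.normal(-1.0, 1.0, size=5)  # synthetic negative example
    w, G = adagrad_pairwise_auc_step(w, G, x_pos, x_neg)
```

Coordinates that have seen large gradients accumulate a large `G` and receive smaller steps, while rarely updated coordinates keep larger steps; this is the "natural tuning of the learning rate" that makes the adaptive strategy less sensitive to the choice of `eta`.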

Keywords

Adaptive algorithms, Adaptive gradient methods, Benchmark datasets, Effectiveness and efficiencies, Empirical performance, Nonadaptive algorithm, Simple stochastic, Theoretical bounds, Updating strategy

Discipline

Computer Sciences | Databases and Information Systems | Theory and Algorithms

Research Areas

Data Science and Engineering

Publication

Proceedings of the Twenty-Ninth AAAI Conference on Artificial Intelligence: January 25-30, 2015, Austin

First Page

2568

Last Page

2574

ISBN

9781577357025

Publisher

AAAI Press

City or Country

Palo Alto, CA

Copyright Owner and License

Authors

Additional URL

https://www.aaai.org/ocs/index.php/AAAI/AAAI15/paper/view/9500
