Publication Type
Conference Proceeding Article
Version
acceptedVersion
Publication Date
7-2020
Abstract
Stochastic variance-reduced gradient (SVRG) algorithms have been shown to perform favorably in solving large-scale learning problems. Despite this remarkable success, the stochastic gradient complexity of SVRG-type algorithms usually scales linearly with data size and thus could still be expensive for huge datasets. To address this deficiency, we propose a hybrid stochastic-deterministic minibatch proximal gradient (HSDMPG) algorithm for strongly convex problems that enjoys provably improved, data-size-independent complexity guarantees.
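For readers unfamiliar with the building blocks the abstract refers to, the sketch below illustrates a generic minibatch proximal gradient step on a composite objective (smooth least-squares loss plus an l1 penalty). It is only an assumed, minimal illustration of the proximal-gradient and minibatch-gradient ingredients; it is not the paper's HSDMPG algorithm, and all function names and parameters here are hypothetical.

```python
import numpy as np

# Generic illustration only: one minibatch proximal gradient step.
# This is NOT the HSDMPG algorithm from the paper; it merely sketches
# the two ingredients named in the abstract (stochastic minibatch
# gradients and a proximal update).

def prox_l1(v, tau):
    """Proximal operator of tau * ||.||_1 (soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def minibatch_prox_grad_step(w, X, y, lr=0.1, batch_size=32, lam=1e-3, rng=None):
    """Gradient step on a random minibatch of the least-squares loss,
    followed by the prox of an l1 penalty (forward-backward update)."""
    rng = np.random.default_rng() if rng is None else rng
    idx = rng.choice(X.shape[0], size=batch_size, replace=False)
    Xb, yb = X[idx], y[idx]
    grad = Xb.T @ (Xb @ w - yb) / batch_size   # minibatch gradient of the smooth part
    return prox_l1(w - lr * grad, lr * lam)    # proximal (backward) step

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.standard_normal((1000, 20))
    w_true = np.zeros(20); w_true[:5] = 1.0
    y = X @ w_true + 0.01 * rng.standard_normal(1000)
    w = np.zeros(20)
    for _ in range(200):
        w = minibatch_prox_grad_step(w, X, y, rng=rng)
    print("recovered support:", np.flatnonzero(np.abs(w) > 0.1))
```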
Discipline
Databases and Information Systems
Research Areas
Intelligent Systems and Optimization
Areas of Excellence
Digital transformation
Publication
Proceedings of the 37th International Conference on Machine Learning (ICML 2020), Virtual Conference, July 13-18
First Page
1
Last Page
10
Publisher
PMLR
City or Country
Virtual Conference
Citation
ZHOU, Pan and YUAN, Xiaotong.
Hybrid stochastic-deterministic minibatch proximal gradient: Less-than-single-pass optimization with nearly optimal generalization. (2020). Proceedings of the 37th International Conference on Machine Learning (ICML 2020), Virtual Conference, July 13-18. 1-10.
Available at: https://ink.library.smu.edu.sg/sis_research/9030
Creative Commons License
This work is licensed under a Creative Commons Attribution-NonCommercial-No Derivative Works 4.0 International License.
Additional URL
https://proceedings.mlr.press/v119/zhou20g.html