Publication Type

Conference Proceeding Article

Version

Accepted Version

Publication Date

7-2020

Abstract

Stochastic variance-reduced gradient (SVRG) algorithms have been shown to perform well in solving large-scale learning problems. Despite this remarkable success, the stochastic gradient complexity of SVRG-type algorithms usually scales linearly with the data size and can therefore still be expensive for very large datasets. To address this deficiency, we propose a hybrid stochastic-deterministic minibatch proximal gradient (HSDMPG) algorithm for strongly convex problems that enjoys provably improved, data-size-independent complexity guarantees.
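To make the complexity point concrete, below is a minimal NumPy sketch of vanilla SVRG (Johnson and Zhang, 2013) on a least-squares objective; it is illustrative only and is not the paper's HSDMPG algorithm. Note the full-gradient snapshot at the top of each epoch: that single pass costs n stochastic gradient evaluations, which is the data-size-linear term the abstract identifies as the deficiency. The function name and parameters (svrg, lr, epochs, inner_steps) are hypothetical.

import numpy as np

def svrg(A, b, lr=0.1, epochs=5, inner_steps=None, seed=0):
    """Minimize (1/2n) * ||A w - b||^2 with vanilla SVRG (illustrative sketch)."""
    rng = np.random.default_rng(seed)
    n, d = A.shape
    m = inner_steps or n  # inner-loop length, commonly on the order of n
    w = np.zeros(d)
    for _ in range(epochs):
        # Full-gradient snapshot: one pass over all n samples per epoch,
        # i.e. the O(n) term that makes SVRG's complexity scale with data size.
        w_snap = w.copy()
        full_grad = A.T @ (A @ w_snap - b) / n
        for _ in range(m):
            i = rng.integers(n)
            # Variance-reduced stochastic gradient at sample i.
            g_i = A[i] * (A[i] @ w - b[i])
            g_snap = A[i] * (A[i] @ w_snap - b[i])
            w -= lr * (g_i - g_snap + full_grad)
    return w

# Tiny usage check on synthetic noiseless data.
rng = np.random.default_rng(1)
A = rng.standard_normal((200, 5))
w_true = rng.standard_normal(5)
b = A @ w_true
print(np.linalg.norm(svrg(A, b) - w_true))  # should be close to zero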

Discipline

Databases and Information Systems

Research Areas

Intelligent Systems and Optimization

Areas of Excellence

Digital transformation

Publication

Proceedings of the 37th International Conference on Machine Learning (ICML 2020), Virtual Conference, July 13-18, 2020

First Page

1

Last Page

10

Publisher

PMLR

City or Country

Virtual Conference

Additional URL

https://proceedings.mlr.press/v119/zhou20g.html
