Publication Type

Conference Proceeding Article

Version

acceptedVersion

Publication Date

6-2019

Abstract

Variance reduction techniques like SVRG provide simple and fast algorithms for optimizing a convex finite-sum objective. For nonconvex objectives, these techniques can also find a first-order stationary point (with small gradient). However, in nonconvex optimization it is often crucial to find a second-order stationary point (with small gradient and almost PSD Hessian). In this paper, we show that Stabilized SVRG (a simple variant of SVRG) can find an $\epsilon$-second-order stationary point using only $\tilde{O}(n^{2/3}/\epsilon^2 + n/\epsilon^{1.5})$ stochastic gradients. To the best of our knowledge, this is the first second-order guarantee for a simple variant of SVRG. The running time almost matches the known guarantees for finding $\epsilon$-first-order stationary points.
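For context, the abstract builds on the standard SVRG update; below is a minimal Python sketch of plain SVRG (not the paper's stabilized variant). The names grad_i, eta, epochs, and m, and their defaults, are illustrative assumptions, not taken from the paper.

    import numpy as np

    def svrg(grad_i, w0, n, eta=0.01, epochs=10, m=None):
        # grad_i(w, i): gradient of the i-th component function at w
        # (hypothetical interface for this sketch).
        m = m if m is not None else n  # inner-loop length; n is a common choice
        w = w0.copy()
        for _ in range(epochs):
            w_snap = w.copy()
            # Full gradient at the snapshot anchors the variance reduction.
            full_grad = np.mean([grad_i(w_snap, i) for i in range(n)], axis=0)
            for _ in range(m):
                i = np.random.randint(n)
                # Variance-reduced estimate: unbiased, with variance that
                # shrinks as w approaches the snapshot w_snap.
                v = grad_i(w, i) - grad_i(w_snap, i) + full_grad
                w = w - eta * v
        return w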

Discipline

Databases and Information Systems

Research Areas

Data Science and Engineering; Intelligent Systems and Optimization

Publication

Proceedings of the 32nd Conference on Learning Theory (COLT 2019), Phoenix, USA, June 25-28, 2019

Volume

99

First Page

1394

Last Page

1448

Publisher

Proceedings of Machine Learning Research

City or Country

Phoenix, USA

Additional URL

https://proceedings.mlr.press/v99/ge19a
