Publication Type
Journal Article
Version
publishedVersion
Publication Date
8-2022
Abstract
Emerging applications in multiagent environments, such as the internet of things, networked sensing, autonomous systems, and federated learning, call for decentralized algorithms for finite-sum optimization that are resource efficient in terms of both computation and communication. In this paper, we consider the prototypical setting where the agents work collaboratively to minimize the sum of local loss functions by communicating only with their neighbors over a predetermined network topology. We develop a new algorithm, called DEcentralized STochastic REcurSive gradient methodS (DESTRESS), for nonconvex finite-sum optimization, which matches the optimal incremental first-order oracle complexity of centralized algorithms for finding first-order stationary points while maintaining communication efficiency. Detailed theoretical and numerical comparisons corroborate that the resource efficiency of DESTRESS improves upon prior decentralized algorithms over a wide range of parameter regimes. DESTRESS leverages several key algorithm design ideas, including stochastic recursive gradient updates with minibatches for local computation and gradient tracking with extra mixing (i.e., multiple gossiping rounds) for per-iteration communication, together with careful choices of hyperparameters and new analysis frameworks, to provably achieve a desirable computation-communication trade-off.
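For intuition only, the following is a minimal NumPy sketch of the two generic building blocks named in the abstract: a SARAH-style recursive gradient estimator and gradient tracking with extra mixing (multiple gossip rounds). The function names, signatures, and update order here are illustrative assumptions, not the paper's actual DESTRESS pseudocode or notation.

```python
import numpy as np

def sarah_estimator(grad_fn, x_new, x_old, v_old, batch_idx):
    """SARAH-style recursive gradient estimate on a sampled minibatch
    (illustrative): v_new = grad_B(x_new) - grad_B(x_old) + v_old."""
    return grad_fn(x_new, batch_idx) - grad_fn(x_old, batch_idx) + v_old

def extra_mixing(W, X, K):
    """K rounds of gossip averaging over the network: X <- W^K X.
    W is an (n, n) doubly stochastic mixing matrix for the topology;
    row i of X is agent i's local copy of the optimization variable."""
    for _ in range(K):
        X = W @ X
    return X

def tracked_descent_step(W, X, S, V_new, V_old, eta, K):
    """One gradient-tracking step with extra mixing (a sketch under the
    assumptions above, not the paper's method): refresh the tracker S
    with the change in local gradient estimates, gossip it, then take a
    gossiped descent step with stepsize eta."""
    S = extra_mixing(W, S + V_new - V_old, K)  # track the average gradient
    X = extra_mixing(W, X - eta * S, K)        # mixed local descent step
    return X, S
```

Raising K trades extra communication per iteration for better consensus among agents, which is the computation-communication trade-off the abstract refers to.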
Keywords
decentralized optimization, nonconvex finite-sum optimization, stochastic recursive gradient methods
Discipline
Databases and Information Systems
Research Areas
Data Science and Engineering; Intelligent Systems and Optimization
Publication
SIAM Journal on Mathematics of Data Science
Volume
4
Issue
3
First Page
1031
Last Page
1051
Identifier
10.1137/21M1450677
Publisher
Society for Industrial and Applied Mathematics
Citation
LI, Boyue; LI, Zhize; and CHI, Yuejie.
DESTRESS: Computation-optimal and communication-efficient decentralized nonconvex finite-sum optimization. (2022). SIAM Journal on Mathematics of Data Science, 4 (3), 1031-1051.
Available at: https://ink.library.smu.edu.sg/sis_research/8691
Creative Commons License
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
Additional URL
https://doi.org/10.1137/21M1450677