Publication Type
Conference Proceeding Article
Version
acceptedVersion
Publication Date
12-2022
Abstract
Communication efficiency has been widely recognized as the bottleneck for large-scale decentralized machine learning applications in multi-agent or federated environments. To tackle the communication bottleneck, there have been many efforts to design communication-compressed algorithms for decentralized nonconvex optimization, where the clients are only allowed to communicate a small amount of quantized information (aka bits) with their neighbors over a predefined graph topology. Despite significant efforts, the state-of-the-art algorithm in the nonconvex setting still suffers from a slower rate of convergence $O((G/T)^{2/3})$ compared with its uncompressed counterpart, where $G$ measures the data heterogeneity across different clients, and $T$ is the number of communication rounds. This paper proposes BEER, which adopts communication compression with gradient tracking, and shows it converges at a faster rate of $O(1/T)$. This significantly improves over the state-of-the-art rate by matching the rate without compression even under arbitrary data heterogeneity. Numerical experiments are also provided to corroborate our theory and confirm the practical superiority of BEER in the data-heterogeneous regime.
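For a concrete picture of how communication compression and gradient tracking fit together, the sketch below implements one synchronized round of a BEER-style update in NumPy. The specific update equations, the top-k compressor, the ring mixing matrix, and the step sizes are illustrative assumptions made here for exposition; this is a minimal sketch, not the authors' reference implementation.

```python
# Minimal NumPy sketch of one BEER-style round (compressed communication +
# gradient tracking). The concrete update rules, the top-k compressor, the
# ring topology, and the step sizes are illustrative assumptions, not the
# paper's reference implementation.
import numpy as np

def top_k(vec, k):
    """Keep the k largest-magnitude entries of a 1-D vector, zero the rest."""
    out = np.zeros_like(vec)
    idx = np.argsort(np.abs(vec))[-k:]
    out[idx] = vec[idx]
    return out

def ring_mixing_matrix(n):
    """Symmetric, doubly stochastic mixing matrix for a ring of n >= 3 clients."""
    W = np.zeros((n, n))
    for i in range(n):
        W[i, i] = 0.5
        W[i, (i - 1) % n] = 0.25
        W[i, (i + 1) % n] = 0.25
    return W

def beer_style_round(x, h, v, g, grads, grad_fn, W, gamma=0.05, eta=0.05, k=2):
    """
    One synchronized round for all clients (rows of x).
    x: local models; v: gradient trackers; h, g: compressed surrogates of x and v
       held by neighbors; grads: local gradients at the current models.
    """
    L = W - np.eye(W.shape[0])                            # graph difference operator
    x_next = x + gamma * L @ h - eta * v                  # consensus on surrogates, then descent
    q_x = np.apply_along_axis(top_k, 1, x_next - h, k)    # compressed message actually sent
    h_next = h + q_x
    grads_next = grad_fn(x_next)
    v_next = v + gamma * L @ g + grads_next - grads       # gradient tracking with surrogates
    q_v = np.apply_along_axis(top_k, 1, v_next - g, k)    # second compressed message
    g_next = g + q_v
    return x_next, h_next, v_next, g_next, grads_next

# Toy usage: 8 clients with heterogeneous quadratics f_i(x) = 0.5 * ||x - b_i||^2,
# whose global minimizer is the mean of the b_i.
rng = np.random.default_rng(0)
n, d = 8, 10
b = rng.normal(size=(n, d))

def grad_fn(X):
    return X - b                                          # stacked local gradients

W = ring_mixing_matrix(n)
x = np.zeros((n, d)); h = np.zeros((n, d))
grads = grad_fn(x); v = grads.copy(); g = np.zeros((n, d))
for _ in range(1000):
    x, h, v, g, grads = beer_style_round(x, h, v, g, grads, grad_fn, W)
print(np.linalg.norm(x.mean(axis=0) - b.mean(axis=0)))    # averaged model approaches the minimizer
```

In this sketch each client ships only the compressed differences q_x and q_v to its neighbors, so per-round communication scales with k rather than the model dimension, while the gradient-tracking variable v keeps the averaged iterate moving along the exact average gradient regardless of how heterogeneous the local objectives are.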
Discipline
Databases and Information Systems
Research Areas
Data Science and Engineering; Intelligent Systems and Optimization
Publication
Proceedings of the 36th Conference on Neural Information Processing Systems (NeurIPS 2022), Hybrid Conference, November 28 - December 9
First Page
1
Last Page
26
Publisher
Neural Information Processing Systems Foundation
City or Country
New Orleans, USA
Citation
ZHAO, Haoyu; LI, Boyue; LI, Zhize; RICHTARIK, Peter; and CHI, Yuejie.
BEER: Fast O(1/T) rate for decentralized nonconvex optimization with communication compression. (2022). Proceedings of the 36th Conference on Neural Information Processing Systems (NeurIPS 2022), Hybrid Conference, November 28 - December 9. 1-26.
Available at: https://ink.library.smu.edu.sg/sis_research/8687
Creative Commons License
This work is licensed under a Creative Commons Attribution-NonCommercial-No Derivative Works 4.0 International License.
Additional URL
https://proceedings.neurips.cc/paper_files/paper/2022/hash/cd86c6a804d925c4cbc5a7b96843f6d5-Abstract-Conference.html