Publication Type
Conference Proceeding Article
Version
acceptedVersion
Publication Date
12-2022
Abstract
To enable large-scale machine learning in bandwidth-hungry environments such as wireless networks, significant progress has been made recently in designing communication-efficient federated learning algorithms with the aid of communication compression. On the other hand, privacy preservation, especially at the client level, is another important desideratum that has not yet been addressed simultaneously in the presence of advanced communication compression techniques. In this paper, we propose a unified framework that enhances the communication efficiency of private federated learning with communication compression. Exploiting both general compression operators and local differential privacy, we first examine a simple algorithm that applies compression directly to differentially-private stochastic gradient descent, and identify its limitations. We then propose a unified framework SoteriaFL for private federated learning, which accommodates a general family of local gradient estimators including popular stochastic variance-reduced gradient methods and the state-of-the-art shifted compression scheme. We provide a comprehensive characterization of its performance trade-offs in terms of privacy, utility, and communication complexity, where SoteriaFL is shown to achieve better communication complexity than other private federated learning algorithms without communication compression, without sacrificing privacy or utility.
Discipline
Databases and Information Systems
Research Areas
Data Science and Engineering; Intelligent Systems and Optimization
Publication
Proceedings of the 36th Conference on Neural Information Processing Systems (NeurIPS 2022), Hybrid Conference, November 28 - December 9
First Page
1
Last Page
39
Publisher
Neural Information Processing Systems Foundation
City or Country
New Orleans, USA
Citation
LI, Zhize; ZHAO, Haoyu; LI, Boyue; and CHI, Yuejie.
SoteriaFL: A unified framework for private federated learning with communication compression. (2022). Proceedings of the 36th Conference on Neural Information Processing Systems (NeurIPS 2022), Hybrid Conference, November 28 - December 9. 1-39.
Available at: https://ink.library.smu.edu.sg/sis_research/8688
Creative Commons License
This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
Additional URL
https://proceedings.neurips.cc/paper_files/paper/2022/hash/1b645a77cf48821afc3ee7e5b5d42617-Abstract-Conference.html