Publication Type

Journal Article

Version

acceptedVersion

Publication Date

3-2024

Abstract

Due to the communication bottleneck in distributed and federated learning applications, algorithms using communication compression have attracted significant attention and are widely used in practice. Moreover, the huge number, high heterogeneity, and limited availability of clients result in high client variance. This paper addresses these two issues together by proposing compressed and client-variance reduced methods COFIG and FRECON. We prove an $O\left(\frac{(1+\omega)^{3/2}\sqrt{N}}{S\epsilon^2} + \frac{(1+\omega)N^{2/3}}{S\epsilon^2}\right)$ bound on the number of communication rounds of COFIG in the nonconvex setting, where $N$ is the total number of clients, $S$ is the number of clients participating in each round, $\epsilon$ is the convergence error, and $\omega$ is the variance parameter associated with the compression operator. For FRECON, we prove an $O\left(\frac{(1+\omega)\sqrt{N}}{S\epsilon^2}\right)$ bound on the number of communication rounds. In the convex setting, COFIG converges within $O\left(\frac{(1+\omega)\sqrt{N}}{S\epsilon}\right)$ communication rounds, which, to the best of our knowledge, is also the first convergence result for compression schemes that do not communicate with all the clients in each round. We stress that neither COFIG nor FRECON needs to communicate with all the clients, and they enjoy the first or faster convergence results for convex and nonconvex federated learning in the regimes considered. Experimental results point to an empirical superiority of COFIG and FRECON over existing baselines.
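In this literature, the parameter $\omega$ typically characterizes an unbiased compression operator $\mathcal{C}$ satisfying $\mathbb{E}[\mathcal{C}(x)] = x$ and $\mathbb{E}\|\mathcal{C}(x) - x\|^2 \le \omega\|x\|^2$. As an illustration only (not code from the paper), the following minimal NumPy sketch shows the standard rand-$k$ sparsifier, a common example of such an operator with $\omega = d/k - 1$; the function name rand_k and the sanity check are our own.

import numpy as np

def rand_k(x, k, rng):
    # Unbiased rand-k sparsification: keep k random coordinates of x,
    # scaled by d/k so that E[C(x)] = x. It satisfies
    # E||C(x) - x||^2 = (d/k - 1) * ||x||^2, i.e. omega = d/k - 1.
    d = x.size
    idx = rng.choice(d, size=k, replace=False)
    out = np.zeros_like(x)
    out[idx] = (d / k) * x[idx]
    return out

# Empirical sanity check of unbiasedness and the value of omega.
rng = np.random.default_rng(0)
x = rng.standard_normal(100)
k = 10                      # omega = 100/10 - 1 = 9
samples = np.stack([rand_k(x, k, rng) for _ in range(20000)])
print("bias norm:", np.linalg.norm(samples.mean(axis=0) - x))   # close to 0
print("measured omega:", ((samples - x)**2).sum(axis=1).mean() / (x**2).sum())

Note that more aggressive compression (smaller $k$) increases $\omega$, which is precisely the quantity entering the round-complexity bounds quoted in the abstract.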

Keywords

federated learning, distributed optimization, communication compression, variance reduction

Discipline

Databases and Information Systems | Theory and Algorithms

Research Areas

Intelligent Systems and Optimization

Publication

SIAM Journal on Mathematics of Data Science

Volume

6

Issue

1

First Page

154

Last Page

175

ISSN

2577-0187

Identifier

10.1137/23M1553820

Publisher

Society for Industrial and Applied Mathematics

Copyright Owner and License

Authors

Additional URL

https://doi.org/10.1137/23M1553820
