Publication Type

Conference Proceeding Article

Version

acceptedVersion

Publication Date

8-2025

Abstract

As a promising privacy-aware collaborative model training paradigm, Federated Learning (FL) is becoming popular in the design of distributed recommender systems. However, Federated Recommender Systems (FedRecs) suffer from two major problems: i) extremely high communication overhead due to the massive item embeddings involved in recommender systems, and ii) intolerably low training efficiency caused by the entanglement of heterogeneous network environments and client devices. Although existing methods attempt to employ various compression techniques to reduce communication overhead, the parameter errors introduced by model compression inevitably degrade model performance. To address both problems simultaneously, this paper presents a communication-efficient FedRec framework named FedRAS, which adopts an action-sharing strategy that clusters the gradients of item embeddings into a specific number of model-updating actions for communication, rather than directly compressing the item embeddings. In this way, the cloud server can use the limited actions from clients to update all the items. Since gradient values are significantly smaller than item embeddings, constraining the directions of gradients (i.e., the action space) introduces smaller errors than compressing the entire item embedding matrix into a reduced space. To accommodate heterogeneous devices and network environments, FedRAS incorporates an adaptive clustering mechanism that dynamically adjusts the number of actions. Comprehensive experiments on well-known datasets demonstrate that FedRAS can reduce the size of communication payloads by up to 96.88%, without sacrificing recommendation performance across various heterogeneous scenarios. We have open-sourced FedRAS at https://github.com/mastlab-T3S/FedRAS.
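
The following is a minimal sketch (not the authors' implementation) of the action-sharing idea described above: a client clusters the per-item gradient rows of the item embedding matrix into a small number of "actions" (cluster centroids) and uploads only the centroids plus per-item assignment indices, which the server then uses to update every item embedding. The constant K_ACTIONS and the helpers share_actions / apply_actions are illustrative assumptions; FedRAS itself adapts the number of actions dynamically.

```python
# Sketch of action-sharing compression for federated recommendation.
# Assumes a plain k-means clustering of gradient rows; names are illustrative.
import numpy as np
from sklearn.cluster import KMeans

K_ACTIONS = 16               # assumed fixed action count (FedRAS adapts this)
NUM_ITEMS, EMB_DIM = 1000, 32

def share_actions(grad_matrix: np.ndarray, k: int = K_ACTIONS):
    """Client side: compress per-item gradients into k actions + assignments."""
    km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(grad_matrix)
    actions = km.cluster_centers_               # (k, EMB_DIM), uploaded
    assignments = km.labels_.astype(np.uint8)   # (NUM_ITEMS,), uploaded
    return actions, assignments

def apply_actions(item_emb: np.ndarray, actions: np.ndarray,
                  assignments: np.ndarray, lr: float = 0.01) -> np.ndarray:
    """Server side: update each item embedding with its assigned action."""
    approx_grad = actions[assignments]          # (NUM_ITEMS, EMB_DIM)
    return item_emb - lr * approx_grad

# Toy round: one client's local gradients for all item embeddings.
rng = np.random.default_rng(0)
item_embeddings = rng.normal(size=(NUM_ITEMS, EMB_DIM))
local_gradients = rng.normal(scale=1e-2, size=(NUM_ITEMS, EMB_DIM))

actions, assignments = share_actions(local_gradients)
item_embeddings = apply_actions(item_embeddings, actions, assignments)

# Payload comparison: full float32 gradients vs. actions + uint8 assignments.
full_bytes = local_gradients.size * 4
compressed_bytes = actions.size * 4 + assignments.size
print(f"payload reduced from {full_bytes} to {compressed_bytes} bytes "
      f"({100 * (1 - compressed_bytes / full_bytes):.1f}% smaller)")
```

Because only the centroids and a byte-sized index per item cross the network, the payload shrinks roughly in proportion to K_ACTIONS / NUM_ITEMS, which is the intuition behind the reported reduction of up to 96.88%.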

Keywords

federated learning, recommender systems, clustering, communication efficiency, gradient robustness

Discipline

Databases and Information Systems

Research Areas

Intelligent Systems and Optimization

Areas of Excellence

Digital transformation

Publication

KDD '25: Proceedings of the 31st ACM SIGKDD Conference on Knowledge Discovery and Data Mining V.2, Toronto, Canada, August 3-7, 2025

First Page

1999

Last Page

2009

ISBN

9798400714542

Identifier

10.1145/3711896.3736987

Publisher

Association for Computing Machinery

City or Country

Toronto, ON, Canada

Additional URL

https://doi.org/10.1145/3711896.3736987
