Publication Type
Conference Proceeding Article
Version
acceptedVersion
Publication Date
12-2025
Abstract
Although Federated Learning (FL) is promising for privacy-preserving collaborative model training, it suffers from low inference performance when client data are heterogeneous: FL training easily learns client-specific overfitting features, and the coarse-grained averaging adopted by existing FL methods can cause the global model to get stuck in local optima, leading to poor generalization. To address this issue, this paper presents a novel FL framework, FedPhoenix, which stochastically resets partial parameters in each round to destroy some features of the global model, guiding FL training to learn multiple generalized features for inference rather than specific overfitting features. Experimental results on various well-known datasets demonstrate that, compared to SOTA FL methods, FedPhoenix can achieve up to 20.73% higher accuracy. The implementation is publicly available at https://github.com/UniString/FedPhoenix.
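The core idea described in the abstract, stochastically re-initializing a fraction of the global model's parameters each round, can be sketched as follows. This is a minimal illustrative sketch only, not the authors' implementation (see the FedPhoenix repository for that); the function name `stochastic_reset`, the per-entry reset mask, the `reset_fraction` value, and the Gaussian re-initialization are all assumptions made for illustration.

```python
import random

def stochastic_reset(params, reset_fraction=0.1, rng=None):
    """Randomly re-initialize a fraction of each parameter vector's entries.

    Destroying some learned features forces later FL rounds to relearn
    more general ones instead of client-specific overfitting features.
    NOTE: the masking scheme and init distribution here are illustrative
    assumptions, not the actual FedPhoenix reset strategy.
    """
    rng = rng if rng is not None else random.Random()
    new_params = {}
    for name, vec in params.items():
        # Each entry is independently reset with probability reset_fraction.
        new_params[name] = [
            rng.gauss(0.0, 0.02) if rng.random() < reset_fraction else v
            for v in vec
        ]
    return new_params

# Hypothetical server loop step: after averaging client updates into
# global_params, reset part of the global model before the next round.
global_params = {"layer1": [1.0] * 8, "layer2": [0.5] * 4}
global_params = stochastic_reset(global_params, reset_fraction=0.25)
```

A fraction of 0.0 leaves the model untouched and 1.0 re-initializes every entry, so the degree of feature destruction can be tuned per round.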
Discipline
Artificial Intelligence and Robotics
Research Areas
Intelligent Systems and Optimization
Areas of Excellence
Digital transformation
Publication
Proceedings of the Thirty-Ninth Annual Conference on Neural Information Processing Systems, San Diego, CA, USA, 2025 December 2-7
First Page
1
Last Page
27
City or Country
San Diego, CA, USA
Citation
WU, Jiahao; HU, Ming; YANG, Yanxin; XIE, Xiaofei; CHEN, ZeKai; SONG, Chenyu; and CHEN, Mingsong.
Rising from ashes: Generalized federated learning via dynamic parameter reset. (2025). Proceedings of the Thirty-Ninth Annual Conference on Neural Information Processing Systems, San Diego, CA, USA, 2025 December 2-7. 1-27.
Available at: https://ink.library.smu.edu.sg/sis_research/10721
Creative Commons License

This work is licensed under a Creative Commons Attribution-NonCommercial-No Derivative Works 4.0 International License.