Publication Type

Journal Article

Version

publishedVersion

Publication Date

8-2024

Abstract

Federated Learning (FL) enables multiple clients to learn a model collaboratively while keeping their data local. However, traditional synchronous FL solutions suffer from lower accuracy and longer communication time in scenarios where most devices drop out during learning. Therefore, we propose an Asynchronous Federated Learning (AsyFL) scheme that uses time-weighted and stale model aggregation, which effectively mitigates the poor model performance caused by device heterogeneity. We then integrate Symmetric Homomorphic Encryption (SHE) into AsyFL to obtain Asynchronous Privacy-Preserving Federated Learning (Asy-PPFL), which protects client privacy while keeping computation lightweight. Privacy analysis shows that Asy-PPFL is indistinguishable under a Known-Plaintext Attack (KPA), and convergence analysis proves the effectiveness of our schemes. Extensive experiments show that AsyFL and Asy-PPFL achieve the highest accuracies of 58.40% and 58.26%, respectively, on the CIFAR-10 dataset when most clients (i.e., 80%) are offline or delayed.
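For intuition, a staleness-weighted aggregation step of the kind the abstract describes can be sketched in Python. This is a minimal illustration assuming a FedAsync-style polynomial decay; the function names, decay form, and mixing rule are assumptions for exposition, not the paper's exact time-weighting scheme, and the SHE encryption layer is omitted.

def staleness_weight(staleness: int, alpha: float = 0.6) -> float:
    """Polynomial staleness decay: a fresh update gets weight close to
    alpha, while stale ones are attenuated. The decay form and alpha
    are illustrative assumptions, not the paper's exact rule."""
    return alpha * (1.0 + staleness) ** -0.5

def aggregate_async(global_model, client_model, staleness):
    """Mix one (possibly stale) client update into the global model:
    w_new = (1 - s) * w_global + s * w_client, where s shrinks as the
    update grows stale, so outdated clients perturb the model less."""
    s = staleness_weight(staleness)
    return {name: (1.0 - s) * w + s * client_model[name]
            for name, w in global_model.items()}

# Example: an update trained on the round-7 model arrives at round 10,
# so it carries staleness 3 and is down-weighted accordingly.
print(aggregate_async({"layer": 0.50}, {"layer": 0.90}, staleness=3))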

Keywords

Computational modeling, Convergence, Federated learning, Heterogeneity, Homomorphic encryption, Lightweight computing, Privacy, Symmetric homomorphic encryption

Discipline

Databases and Information Systems | Theory and Algorithms

Research Areas

Data Science and Engineering; Cybersecurity; Information Systems and Management

Publication

IEEE Transactions on Dependable and Secure Computing

Volume

21

Issue

4

First Page

2361

Last Page

2375

ISSN

1545-5971

Identifier

10.1109/TDSC.2023.3304788

Publisher

Institute of Electrical and Electronics Engineers

Additional URL

https://doi.org/10.1109/TDSC.2023.3304788
