Publication Type

Journal Article

Version

acceptedVersion

Publication Date

9-2024

Abstract

Due to the powerful representation ability and superior performance of Deep Neural Networks (DNNs), Federated Learning (FL) based on DNNs has attracted much attention from both academia and industry. However, the plaintext data it transmits causes privacy disclosure. FL schemes based on Local Differential Privacy (LDP) can provide privacy protection to a certain extent, but they still cannot achieve adaptive perturbation in DNN models. In addition, such schemes incur high communication overhead due to the curse of dimensionality of DNNs, and are naturally vulnerable to backdoor attacks due to the inherently distributed nature of FL. To address these issues, we propose an Efficient and Secure Federated Learning scheme (ESFL) against backdoor attacks using adaptive LDP and compressive sensing. Formal security analysis proves that ESFL satisfies ϵ-LDP security. Extensive experiments on three datasets demonstrate that ESFL solves the problems of traditional LDP-based FL schemes without a loss of model accuracy and efficiently resists backdoor attacks.
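
The sketch below is only a generic illustration of the two ingredients the abstract names, namely compressive sensing of a high-dimensional model update followed by client-side LDP perturbation; it is not the authors' ESFL algorithm (which uses adaptive LDP), and all function names, the Laplace mechanism, and the clipping and dimension parameters are illustrative assumptions.

```python
import numpy as np

def compress_update(update, m, seed=0):
    """Compressive-sensing-style measurement: project a flattened model
    update of dimension d onto m << d random Gaussian directions.
    The seed would be shared with the server so aggregation is consistent."""
    d = update.size
    rng = np.random.default_rng(seed)
    # Gaussian measurement matrix, scaled so norms are roughly preserved
    phi = rng.normal(0.0, 1.0 / np.sqrt(m), size=(m, d))
    return phi @ update.ravel()

def perturb_ldp(measurements, epsilon, clip_norm=1.0):
    """Illustrative epsilon-LDP perturbation (Laplace mechanism, an assumption):
    clip the compressed update to bound its L1 sensitivity, then add Laplace
    noise calibrated to that sensitivity and the privacy budget epsilon."""
    v = measurements.copy()
    l1 = np.abs(v).sum()
    if l1 > clip_norm:
        v *= clip_norm / l1               # bound L1 norm by clip_norm
    scale = 2.0 * clip_norm / epsilon     # Laplace scale = sensitivity / epsilon
    return v + np.random.laplace(0.0, scale, size=v.shape)

# Example: a client compresses a 100k-parameter update to 1,000 noisy
# measurements before reporting, reducing per-round communication.
update = np.random.randn(100_000) * 0.01
report = perturb_ldp(compress_update(update, m=1_000), epsilon=1.0)
print(report.shape)  # (1000,)
```

Compressing before perturbing is what lets such a pipeline attack both problems at once: the client sends far fewer values, and noise is added in the low-dimensional measurement space rather than across every DNN parameter.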

Keywords

Adaptation models, Adaptive local differential privacy, Artificial neural networks, Backdoor attacks, Compressive sensing, Federated learning, Gaussian noise, Privacy, Servers, Training

Discipline

Information Security

Research Areas

Cybersecurity

Publication

IEEE Transactions on Dependable and Secure Computing

Volume

21

Issue

5

First Page

4619

Last Page

4636

ISSN

1545-5971

Identifier

10.1109/TDSC.2024.3354736

Publisher

Institute of Electrical and Electronics Engineers

Copyright Owner and License

Authors

Additional URL

https://doi.org/10.1109/TDSC.2024.3354736
