Achieving efficient and privacy-preserving neural network training and prediction in cloud environments

Publication Type

Journal Article

Publication Date

10-2023

Abstract

Neural networks have been widely used to train predictive models for applications such as image processing, disease prediction, and face recognition. To produce more accurate models, powerful third parties (e.g., clouds) are usually employed to collect data from a large number of users, which, however, may raise concerns about user privacy. In this paper, we propose an Efficient and Privacy-preserving Neural Network scheme, named EPNN, to deal with the privacy issues in cloud-based neural networks. EPNN is designed based on a two-cloud model and techniques of data perturbation and an additively homomorphic cryptosystem. This scheme enables two clouds to cooperatively perform neural network training and prediction in a privacy-preserving manner and significantly reduces the computation and communication overhead among participating entities. Through a detailed analysis, we demonstrate the security of EPNN. Extensive experiments based on real-world datasets show that EPNN is more efficient than existing schemes in terms of computational costs and communication overhead.
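For readers unfamiliar with the additively homomorphic primitive the abstract refers to, the following is a minimal Paillier-style sketch in Python. It is an illustration of additive homomorphism in general, not the authors' implementation or protocol, and the toy primes are hypothetical placeholders far too small for real use.

```python
# Minimal sketch of an additively homomorphic (Paillier-style) cryptosystem.
# Toy parameters for illustration only; not the paper's scheme and not secure key sizes.
import math
import secrets

# Hypothetical toy primes (real deployments use primes of 1024 bits or more).
p, q = 1000003, 1000033
n = p * q
n_sq = n * n
g = n + 1                      # standard generator choice g = n + 1
lam = math.lcm(p - 1, q - 1)   # Carmichael's lambda(n)
mu = pow(lam, -1, n)           # modular inverse of lambda mod n

def encrypt(m: int) -> int:
    """Encrypt integer m < n: c = g^m * r^n mod n^2 with random r."""
    r = secrets.randbelow(n - 1) + 1
    return (pow(g, m, n_sq) * pow(r, n, n_sq)) % n_sq

def decrypt(c: int) -> int:
    """Decrypt: m = L(c^lambda mod n^2) * mu mod n, where L(x) = (x - 1) // n."""
    x = pow(c, lam, n_sq)
    return ((x - 1) // n * mu) % n

# Additive homomorphism: multiplying ciphertexts adds the plaintexts,
# which lets a cloud aggregate encrypted values without decrypting them.
c = (encrypt(15) * encrypt(27)) % n_sq
assert decrypt(c) == 42
```

In a two-cloud setting of the kind the abstract describes, such a primitive would typically let one cloud aggregate encrypted intermediate values while only the other holds the decryption key; the exact protocol is specified in the paper itself.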

Keywords

Privacy-preserving, neural network, data perturbation, additively homomorphic cryptosystem, cloud environments

Discipline

Databases and Information Systems | OS and Networks

Publication

IEEE Transactions on Dependable and Secure Computing

Volume

20

Issue

5

First Page

4245

Last Page

4257

ISSN

1545-5971

Identifier

10.1109/TDSC.2022.3208706

Publisher

Institute of Electrical and Electronics Engineers

Additional URL

https://doi.org/10.1109/TDSC.2022.3208706
