2020
DOI: 10.1016/j.ins.2020.03.074

Privacy-Preserving distributed deep learning based on secret sharing

Citations: Cited by 40 publications (6 citation statements)
References: 15 publications

“…However, existing federated learning mostly relies on a central server to generate the global model parameters. This centralized architecture suffers from problems such as a single point of failure, privacy leakage, and performance bottlenecks. To address these problems, some scholars use cryptographic methods such as homomorphic encryption [2][3][4][5] and secure multiparty computation [6][7][8][9] to encrypt the model parameters sent by participants and thereby resist model inference attacks. Other scholars have proposed differential privacy [10,11], which makes it difficult for attackers to infer the original model parameters by adding perturbation noise to them.…”
Section: Introduction (mentioning)
confidence: 99%
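
The secret-sharing and secure multiparty computation approach mentioned in this excerpt (and named in the indexed paper's title) typically hides each participant's model parameters by splitting them into random shares, so no single server ever sees a plaintext update. Below is a minimal sketch of additive secret sharing over a finite field with a fixed-point encoding; the modulus, scale factor, and function names are illustrative assumptions, not the protocol of the cited paper.

```python
import numpy as np

# Minimal sketch of additive secret sharing of a weight vector (illustrative only;
# PRIME, SCALE, and the fixed-point encoding are assumptions, not the paper's protocol).
PRIME = 2**31 - 1   # field modulus (assumed for the example)
SCALE = 10**6       # fixed-point scale for real-valued weights (assumed)

def share(weights, n_parties):
    """Split a weight vector into n_parties additive shares that sum to the secret mod PRIME."""
    fixed = np.mod(np.round(weights * SCALE).astype(np.int64), PRIME)
    shares = [np.random.randint(0, PRIME, size=fixed.shape, dtype=np.int64)
              for _ in range(n_parties - 1)]
    shares.append(np.mod(fixed - sum(shares), PRIME))  # last share completes the sum
    return shares

def reconstruct(shares):
    """Recombine all shares and undo the fixed-point encoding."""
    total = np.mod(sum(shares), PRIME)
    signed = np.where(total > PRIME // 2, total - PRIME, total)  # map back to signed values
    return signed.astype(np.float64) / SCALE

w = np.array([0.25, -1.5, 3.0])
parts = share(w, n_parties=3)   # each party/server holds one uniformly random-looking share
print(reconstruct(parts))       # recovers approximately [0.25, -1.5, 3.0]
```

Any single share is uniformly distributed and carries no information about the secret on its own, which is why methods in this family can resist inference attacks by an honest-but-curious aggregator.
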
“…Finally, we would like to suggest some possible directions, briefly, for researchers and practitioners as a brainstorming concept: reducing the reaction time and maximizing VM resource allocation considering the QoS factor; improving load stability in WSN using RCNN learning; and SVM-PSO based community forensics and RNN techniques for intrusion detection.
… | Feature selection from natural algorithm
Koroniotis et al [133] | Quality of service | NF | PSO and DL | Enhance NF
Al hawaitat et al [134] | WS | PNS | PSO | Jamming attack
Shi et al [135] | Anomaly detection | P | ADAID 1 | Presented unsupervised clustering
Usman et al [96] | VM allocation | VR | EFPA 2 | Energy-oriented allocation
Singh et al [103] | VM migration | VR | HBGA 3 | Energy reduction
Naik et al [130] | VM allocation | VR | Fruit fly | Reduce host migration
Meng & Pan [136] | Optimization | VR | FFOA | Solve MKP 4
Mosa & Paton [126] | VM placement | VR | GA | Reduce response time & maximize resource utilization
Duan et al [137] | Information leakage | P | DL | Protect server
Festag & Spreckelsen [138] | Data leakage | P | DL | Detection of protected health information
Chari et al [125] | Quality of service | IA | DL | Generate password via cognitive information
Li et al [139] | Signal processing | IA | GA | Feature extraction via EEG signal
Saini & Kansal [127] | WSN | ACS | SI | Reduce energy consumption and increase network lifetime
Chen et al [140] | Biometric identification | IA | CNN | Proposed GSLT-CNN using human brain EEG
Cao & Fang [141] | Multilayer defense scenario | ACS | SI | Found proficient IPSO elucidating extensive WTA problem
Aliyu et al [124] | Resource allocation | ACS | Ant colony | Illustrated faster convergence and optimized makespan time
Poonia [142] | VANET | ACS | SI | Found significant difference between VANET routing protocols and swarm-based protocols
Verma et al [129] | Feature extraction | ID | GA | Reduce features to classify network packets
Tan et al [148] | Real time network attack intrusion | ID | NN | Able to detect in network precisely…”
Section: Discussion (mentioning)
confidence: 99%
“…Given the concerns of protecting patient information, medical data are often the property of individual institutions, and there is a lack of data-sharing systems to link institutions. Fortunately, this obstacle is beginning to be overcome, with privacy-preserving distributed DL (DDL) and multicenter data-sharing agreements [137][138][139]. DDL provides a privacy-preserving solution to enable multiple parties to jointly learn via a deep model without explicitly sharing local datasets.…”
Section: Data Access and Medical Ethics (mentioning)
confidence: 99%
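
The DDL setting described in this excerpt has multiple sites jointly fitting one model while only model updates, never patient records, leave each institution. As a rough, generic illustration (not the method of the indexed paper or of the citing work, and without the secret-sharing protection the indexed paper adds), the sketch below runs plain federated averaging on a toy linear-regression task; all names, sizes, and learning rates are assumptions.

```python
import numpy as np

# Toy federated averaging: each site takes a gradient step on its private data,
# and only the resulting parameter vectors are averaged by the coordinator.
def local_step(w, X, y, lr=0.05):
    """One squared-loss gradient step on a site's private (X, y); only w leaves the site."""
    grad = 2.0 * X.T @ (X @ w - y) / len(y)
    return w - lr * grad

def federated_round(w_global, site_data, lr=0.05):
    """Each site refines the global model locally; the coordinator averages the results."""
    local_models = [local_step(w_global.copy(), X, y, lr) for X, y in site_data]
    return np.mean(local_models, axis=0)

rng = np.random.default_rng(0)
true_w = np.array([2.0, -3.0])
sites = []
for _ in range(3):                         # e.g. three institutions with private datasets
    X = rng.normal(size=(50, 2))
    sites.append((X, X @ true_w + 0.1 * rng.normal(size=50)))

w = np.zeros(2)
for _ in range(200):                       # federated training rounds
    w = federated_round(w, sites)
print(w)                                   # close to [2.0, -3.0] without pooling raw data
```

In a privacy-preserving variant, the average would be computed over encrypted or secret-shared updates, so the coordinator never sees any single site's model in the clear.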