2021
DOI: 10.1016/j.cose.2021.102402

Privacy preservation in federated learning: An insightful survey from the GDPR perspective

Cited by 180 publications (102 citation statements)
References 30 publications
“…In ML-based IDS, CTI is challenging to achieve, as benign and malicious training data sets from several organisations are required for model training, which often poses privacy and security concerns. More importantly, recently introduced laws such as the General Data Protection Regulation (GDPR) [20] in Europe aim to protect data privacy and address unauthorised sharing of sensitive user information. Infraction of such laws often attracts serious legal issues and hefty fines; in the case of GDPR, the fine can be up to $20 million [21].…”
Section: Cyber Threat Intelligence
Mentioning confidence: 99%
“…1, SIC is adopted to decode the model parameters of each device from the received superposition signal. In this way, the BS is capable of distinguishing malicious devices that mount attacks on the system by analyzing the statistical characteristics of the decoded model parameters [10]. Beyond that, memorizing aggregated global model parameters also benefits the convergence of the federated learning process, through techniques such as momentum-based stochastic gradient descent [11].…”
Section: A System Model
Mentioning confidence: 99%
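The excerpt points to two server-side ideas: screening devices by simple statistics of their decoded model parameters, and smoothing the global update with momentum (as in server-side FedAvgM). The sketch below illustrates both with NumPy; the function names, the z-score screening rule, and the hyperparameters are illustrative assumptions, not the exact method of the cited works.

```python
import numpy as np

def screen_clients(client_weights, z_threshold=3.0):
    """Drop client updates whose parameter-vector norm is a statistical outlier.

    Mimics the idea of spotting malicious devices by analysing statistics of the
    decoded model parameters; the z-score rule and threshold are assumptions.
    """
    norms = np.array([np.linalg.norm(w) for w in client_weights])
    z_scores = (norms - norms.mean()) / (norms.std() + 1e-12)
    return [w for w, z in zip(client_weights, z_scores) if abs(z) <= z_threshold]

def momentum_aggregate(global_w, client_weights, momentum, server_lr=1.0, beta=0.9):
    """One FedAvgM-style server step: average the surviving client models and
    apply the update through a server-side momentum buffer."""
    avg_w = np.mean(client_weights, axis=0)
    pseudo_grad = global_w - avg_w              # direction towards the client average
    momentum = beta * momentum + pseudo_grad    # "memorized" aggregated update
    new_global = global_w - server_lr * momentum
    return new_global, momentum
```

With beta = 0 and server_lr = 1.0 this reduces to plain federated averaging; a non-zero beta carries information from earlier rounds into the current global update, which is the convergence benefit the excerpt attributes to momentum-based SGD.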
“…A more challenging scenario is to combine both the data-free and multi-source settings, i.e., the source data collected from multiple domains with distinct distributions is not accessible due to practical constraints. For example, federated learning [Truong et al., 2021] aggregates the information learned from a group of heterogeneous users. To preserve user privacy, the data of each user is stored locally, and only the trained models are transmitted to the central server.…”
Section: Introduction
Mentioning confidence: 99%
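The excerpt describes the core federated-learning loop the surveyed paper covers: each user's data stays on the client and only trained models travel to the central server. Below is a minimal FedAvg-style sketch in NumPy under assumed conditions (a linear least-squares model and synthetic clients standing in for real private data); it is not the cited authors' implementation.

```python
import numpy as np

def local_train(global_w, X, y, lr=0.1, epochs=5):
    """A few local gradient steps on a linear least-squares model.
    The raw data (X, y) never leaves the client; only the weights are returned."""
    w = global_w.copy()
    for _ in range(epochs):
        grad = X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w

def federated_round(global_w, clients):
    """One federated-averaging round: every client trains locally and the
    server aggregates the returned models, weighted by local sample count."""
    local_models, sizes = [], []
    for X, y in clients:
        local_models.append(local_train(global_w, X, y))
        sizes.append(len(y))
    sizes = np.asarray(sizes, dtype=float)
    return np.average(local_models, axis=0, weights=sizes / sizes.sum())

# Hypothetical usage: two clients, each holding its own private data set.
rng = np.random.default_rng(0)
clients = [(rng.normal(size=(50, 3)), rng.normal(size=50)) for _ in range(2)]
global_w = np.zeros(3)
for _ in range(10):
    global_w = federated_round(global_w, clients)
```

Only the weight vectors cross the client-server boundary in this loop, which is the privacy property that makes federated learning relevant to GDPR-constrained and multi-source data-free settings.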