2020 IEEE 13th International Conference on Cloud Computing (CLOUD)
DOI: 10.1109/cloud49709.2020.00039

MYSTIKO: Cloud-Mediated, Private, Federated Gradient Descent

Cited by 9 publications (7 citation statements) | References 16 publications
“…It is safe to assume that 2 + 2 + 2 ≥ 1. Thus, substituting from (16) in (18), we obtain that, for each ∈ {1, . .…”
Section: A. Impracticality Results and Proofs
confidence: 99%
“…The idea mainly consists in adding DP noise to the gradients computed by the different workers, as explained in Section 2.3. Other techniques have investigated encrypting the gradients shared in the distributed network [18,35] to prevent a passive attacker from violating the privacy of data nodes by simply intercepting the gradients exchanged. However, the Byzantine resilience aspect of distributed SGD is outside the scope of these works: the presented solutions do not account for Byzantine data nodes that can disrupt the training by sending erroneous gradients.…”
Section: Related Work
confidence: 99%
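
The gradient-perturbation idea described in the statement above can be sketched as follows. This is a minimal, generic DP-SGD-style illustration in Python, not code from MYSTIKO or the citing paper; the clipping bound, noise multiplier, and function names are illustrative assumptions.

```python
import numpy as np

def privatize_gradient(grad, clip_norm=1.0, noise_multiplier=1.1, rng=None):
    """Clip a worker's local gradient and add Gaussian noise before sharing.

    Generic Gaussian-mechanism recipe; clip_norm and noise_multiplier are
    illustrative values, not parameters taken from the cited works.
    """
    rng = rng or np.random.default_rng()
    # Bound the worker's contribution so the noise scale can be calibrated.
    norm = np.linalg.norm(grad)
    clipped = grad * min(1.0, clip_norm / (norm + 1e-12))
    # Gaussian noise calibrated to the clipping bound.
    noise = rng.normal(0.0, noise_multiplier * clip_norm, size=grad.shape)
    return clipped + noise

# Each worker perturbs its gradient locally; the aggregator (and any passive
# interceptor) only ever sees noisy gradients.
worker_grads = [np.random.default_rng(i).normal(size=4) for i in range(3)]
aggregate = np.mean([privatize_gradient(g) for g in worker_grads], axis=0)
```
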
“…Other existing technical approaches also prevent a passive attacker from violating the privacy of data and information leakage by exploiting the global model outputs using malicious model updates. For example, the work in [236] presents a novel cryptographic key generation and sharing approach that leverages additive homomorphic encryption to maximize the confidentiality of federated gradient descent in the training of deep neural networks without any loss of accuracy.…”
Section: Security and Privacy Challenges
confidence: 99%
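
The additive-homomorphic-encryption approach in the statement above (which cites this paper as [236]) can be illustrated, at a very high level, by aggregating gradients as ciphertexts. The sketch below uses the third-party python-paillier (`phe`) package and a single keypair purely for illustration; MYSTIKO's actual key generation and sharing protocol is not reproduced here.

```python
from phe import paillier  # third-party python-paillier package (assumed dependency)

# Single keypair for illustration only; the cited work's distributed key
# generation and sharing scheme is more involved.
public_key, private_key = paillier.generate_paillier_keypair(n_length=2048)

# Each worker encrypts its gradient components with the shared public key.
worker_grads = [[0.12, -0.05], [0.07, 0.30], [-0.02, 0.11]]
encrypted = [[public_key.encrypt(x) for x in g] for g in worker_grads]

# The (untrusted) aggregator adds ciphertexts component-wise; with additive
# homomorphic encryption this yields an encryption of the plaintext sum,
# so no individual gradient is ever revealed to the aggregator.
enc_sum = encrypted[0]
for g in encrypted[1:]:
    enc_sum = [a + b for a, b in zip(enc_sum, g)]

# Only holders of the private key can recover the averaged gradient.
avg_grad = [private_key.decrypt(c) / len(worker_grads) for c in enc_sum]
print(avg_grad)  # ≈ component-wise mean of the plaintext gradients
```

Because aggregation happens on ciphertexts rather than on noised values, model accuracy is unaffected, which matches the "without any loss of accuracy" claim quoted above.
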
“…Privacy issues in optimization have gained increasing attention in recent years, in both non-distributed settings [1,25,53,94,96] and distributed settings [57,70,77,107,119]. In distributed settings, some research proposes encryption-based methods to prevent passive attackers from intercepting the exchanged information between agents in the network [70,107], while others utilize differential privacy [77,119], a gold standard notion for privacy preservation in data [31].…”
Section: Fault-tolerance and Privacy
confidence: 99%
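
For reference, the (ε, δ)-differential-privacy guarantee that the statement above calls a gold standard is the textbook definition below (Dwork et al.); it is included only as background and is not specific to any of the cited works.

```latex
% A randomized mechanism M is (\varepsilon, \delta)-differentially private if,
% for all neighboring datasets D and D' (differing in one record) and every
% measurable set S of outputs,
\Pr[\, M(D) \in S \,] \;\le\; e^{\varepsilon} \, \Pr[\, M(D') \in S \,] + \delta .
```
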