ICASSP 2021 - 2021 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP) 2021
DOI: 10.1109/icassp39728.2021.9413811
Graph-Homomorphic Perturbations for Private Decentralized Learning

Abstract: Decentralized algorithms for stochastic optimization and learning rely on the diffusion of information through repeated local exchanges of intermediate estimates. Such structures are particularly appealing in situations where agents may be hesitant to share raw data due to privacy concerns. Nevertheless, in the absence of additional privacy-preserving mechanisms, the exchange of local estimates, which are generated based on private data, can allow for the inference of the data itself. The most common mechanism …

Cited by 7 publications (11 citation statements)
References 22 publications
“…In particular, we introduce the graph federated architecture, which consists of multiple servers, and we privatize the algorithm by ensuring the communication occurring between the servers and the clients is secure. Graph homomorphic perturbations, which were initially introduced in [6], focus on the communication between servers. They are based on adding correlated noise to the messages sent between servers such that the noise cancels out if we were to take the average of all messages across all servers.…”
Section: Introduction
confidence: 99%
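The cancellation property described in this excerpt can be illustrated with a minimal sketch. This is not the paper's exact link-level construction — it simply recentres independent Laplace samples so they sum to zero across the servers, which is one way correlated noise can leave the network-wide average of the exchanged messages unchanged:

```python
import numpy as np

rng = np.random.default_rng(0)
P = 5                          # number of servers (hypothetical)
x = rng.normal(size=P)         # local intermediate estimates, one per server

# Draw independent Laplace samples, then recentre them so they sum to
# zero across the network. Each server perturbs its outgoing message
# with its own noise component; because the components are correlated
# (they sum to zero), the network-wide average is preserved.
g = rng.laplace(loc=0.0, scale=1.0, size=P)
noise = g - g.mean()

perturbed = x + noise
assert np.isclose(perturbed.mean(), x.mean())
```

Individually, each perturbed message still hides the corresponding local estimate; only the aggregate across all servers is noise-free.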
“…Contrarily, the agent-server noise components' subscripts are separated by a comma to highlight a hierarchical structure. Thus, the privatized algorithm can be written as a client update step (6), a server aggregation step (7), and a server combination step (8):…”
Section: Graph Federated Architecture
confidence: 99%
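The three steps named in this excerpt can be sketched schematically. The loss, step size, and combination matrix below are placeholders — the excerpt does not reproduce equations (6)–(8), so this is only a generic shape of such an algorithm, not the cited paper's method:

```python
import numpy as np

rng = np.random.default_rng(2)
P, K, d, mu = 3, 4, 2, 0.1     # servers, clients per server, dimension, step size (all assumed)

w = rng.normal(size=(P, d))    # current server models
A = np.full((P, P), 1.0 / P)   # combination matrix, assumed doubly stochastic

def grad(v):
    # Placeholder gradient of a quadratic loss; stands in for the
    # clients' private losses, which the excerpt does not specify.
    return v

new_w = np.empty_like(w)
for p in range(P):
    # (6) client update: each of server p's clients takes a local gradient step
    client_models = np.stack([w[p] - mu * grad(w[p]) for _ in range(K)])
    # (7) server aggregation: server p averages its clients' returned models
    new_w[p] = client_models.mean(axis=0)
# (8) server combination: servers mix their aggregated models over the graph
new_w = A @ new_w
```

With a fully mixing matrix like the uniform `A` above, one combination step already drives all server models to their common average.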
“…To achieve this, we introduce graph homomorphic perturbations [7] defined as follows. We assume each server p draws a sample g_p,i independently from the Laplace distribution Lap(0, σ_g/√2) with variance σ_g².…”
Section: Graph Homomorphic Perturbations
confidence: 99%
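As a quick sanity check on the parametrization quoted above: Lap(0, b) has variance 2b², so the scale b = σ_g/√2 indeed yields variance σ_g². A numerical illustration (not the paper's code):

```python
import numpy as np

rng = np.random.default_rng(1)
sigma_g = 2.0

# Lap(0, b) has variance 2*b**2, so choosing b = sigma_g / sqrt(2)
# gives samples with variance sigma_g**2, as stated in the excerpt.
b = sigma_g / np.sqrt(2)
samples = rng.laplace(loc=0.0, scale=b, size=1_000_000)
print(samples.var())  # empirically close to sigma_g**2 = 4.0
```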
“…On the other hand, differentially private methods mask the messages by adding some random noise [36]–[46]. They are simple to implement, but they introduce errors into the learned model and reduce the overall utility of the network.…”
Section: Introduction
confidence: 99%