2021
DOI: 10.48550/arxiv.2109.08907
Preprint
Releasing Graph Neural Networks with Differential Privacy Guarantees

Iyiola E. Olatunji,
Thorben Funke,
Megha Khosla

Abstract: With the increasing popularity of Graph Neural Networks (GNNs) in several sensitive applications like healthcare and medicine, concerns have been raised over the privacy aspects of trained GNNs. More notably, GNNs are vulnerable to privacy attacks, such as membership inference attacks, even if only black-box access to the trained model is granted. To build defenses, differential privacy has emerged as a mechanism to disguise the sensitive data in training datasets. Following the strategy of Private Aggregation …

Cited by 4 publications (10 citation statements)
References 19 publications
“…To protect the privacy of graph-structured data, many differential privacy preserving network embedding methods are investigated [136,152,208,228]. For instance, DPNE [208] applies perturbations on the objective function of learning network embeddings.…”
Section: Differential | Citation type: mentioning
confidence: 99%
“…In [227], a perturbed gradient descent method that guarantees the privacy of graph embeddings learned by matrix factorization is proposed. More recently, several works [136,152] that focus on differentially private GNNs are explored. For example, the locally private GNN [152] adopts local differential privacy [92] to protect the privacy of node features by perturbing the user's features locally.…”
Section: Differential | Citation type: mentioning
confidence: 99%
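The citation statement above describes locally private GNNs perturbing each user's node features on the client side, so the server never sees raw features. As an illustrative sketch only (the cited work uses its own multi-bit encoding; the function name, clipping range, and use of the Laplace mechanism here are assumptions for demonstration), local perturbation can look like this:

```python
import numpy as np

def local_laplace_perturb(features, epsilon, sensitivity):
    """Perturb one user's feature vector locally with Laplace noise.

    Assumes features are clipped to [0, 1]; with d features, the L1
    sensitivity of releasing the vector is then at most d, so passing
    sensitivity=d yields epsilon-local differential privacy per user.
    """
    features = np.clip(np.asarray(features, dtype=float), 0.0, 1.0)
    scale = sensitivity / epsilon  # Laplace scale b = Delta / epsilon
    noise = np.random.laplace(loc=0.0, scale=scale, size=features.shape)
    return features + noise

# Each user perturbs locally before sending to the untrusted server,
# which then aggregates the noisy features for GNN training.
raw = np.random.rand(5, 8)  # 5 nodes, 8-dim features
noisy = local_laplace_perturb(raw, epsilon=1.0, sensitivity=8.0)
```

Note the privacy/utility trade-off: smaller epsilon means larger noise scale, and GNN message passing partially averages this noise out across neighbors.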