2020
DOI: 10.1109/access.2020.3040077
Publishing Node Strength Distribution With Node Differential Privacy

Abstract: The challenge of graph data publishing under node-differential privacy comes mainly from the high sensitivity of the query. Whereas edge-differential privacy protects only the relationships between people, node-differential privacy protects both those relationships and each person's own information, and must therefore pay particular attention to protecting personal data. This paper studies the release of the node strength distribution under node-differential privacy by …
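The abstract is truncated above, so the following is only a minimal illustrative sketch, not the paper's algorithm: it computes a node strength histogram from a weighted graph and perturbs it with the Laplace mechanism. The sensitivity value passed to laplace_release is a placeholder assumption; under node-differential privacy the raw sensitivity of such a histogram is high, which is precisely the difficulty the paper addresses.

# Illustrative sketch (not the paper's algorithm): node strength and the
# Laplace mechanism under node-differential privacy.
# Node strength s(v) is the sum of the weights of edges incident to v.
# Under node-DP, removing one node can change the strength of every
# neighbour, so the raw sensitivity of a strength histogram is large;
# here we simply assume an externally supplied sensitivity bound.
import numpy as np


def node_strengths(weights: np.ndarray) -> np.ndarray:
    """Node strengths from a symmetric weighted adjacency matrix."""
    return weights.sum(axis=1)


def strength_histogram(strengths: np.ndarray, bins: np.ndarray) -> np.ndarray:
    """Strength distribution as a histogram over the given bin edges."""
    hist, _ = np.histogram(strengths, bins=bins)
    return hist.astype(float)


def laplace_release(hist: np.ndarray, sensitivity: float, epsilon: float,
                    rng: np.random.Generator) -> np.ndarray:
    """Add Laplace(sensitivity / epsilon) noise to every histogram bin."""
    noise = rng.laplace(scale=sensitivity / epsilon, size=hist.shape)
    return hist + noise


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Small random weighted graph (symmetric, zero diagonal).
    w = np.triu(rng.integers(0, 5, size=(50, 50)), k=1).astype(float)
    w = w + w.T
    hist = strength_histogram(node_strengths(w), bins=np.arange(0, 200, 20))
    # Hypothetical sensitivity bound; the paper derives its own.
    noisy = laplace_release(hist, sensitivity=10.0, epsilon=1.0, rng=rng)
    print(noisy)

The point of the sketch is only to make the objects concrete: the node strength vector, its histogram, and a Laplace release parameterised by sensitivity and ε.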

Cited by 8 publications (4 citation statements). References 17 publications.
“…when spatio-temporal data is published [22]). As a result, there exist two conceivable routes to perform private learning on such data: edge-level DP to protect the connections to other individuals in the graph and prevent unique identification of users like in [55,73,78,92,100] and node-level DP to protect the data of each individual itself (as well as the outgoing edges) like in [8,37,70,131]. Furthermore, Sajadmanesh et al [98] utilise locally differentially private GNNs in the context of social networks.…”
Section: Application Areas For DP On Graphs
confidence: 99%
“…when spatio-temporal data is published [84]). As a result, there exist two sensible routes to perform private learning on such data: edge-level DP to protect the connections to other individuals in the graph and prevent unique identification of users like in [28,20,29] and node-level DP to protect the data of each individual itself (as well as the outgoing edges) like in [27,32,33,47,56,57]. Numerous works have previously been employed to allow private release of social graphs or their associated statistics [32,42,50].…”
Section: Social Network
confidence: 99%
“…Any addition or deletion of one node has little effect on the query results. Liu et al [22] studied node strength distribution based on node differential privacy, reducing sensitivity by limiting weight and degree. Qian et al [23] proposed a private node strength publishing method based on edge differential privacy.…”
Section: Related Work
confidence: 99%
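As background for the sensitivity-reduction idea attributed to [22] in the quote above ("limiting weight and degree"), here is a hedged sketch of one possible projection that clips edge weights and truncates node degrees before strength statistics are released. The parameters w_max and d_max and the mutual-truncation rule are illustrative assumptions, not the specific method of [22].

# Hedged sketch of "limiting weight and degree" to reduce node-DP
# sensitivity, in the spirit of [22]; the exact projection used there may
# differ. The parameter names w_max and d_max are illustrative.
import numpy as np


def project_graph(weights: np.ndarray, w_max: float, d_max: int) -> np.ndarray:
    """Clip edge weights to w_max and keep at most d_max edges per node.

    After projection a node's strength is at most d_max * w_max, and
    removing one node lowers each of its (at most d_max) neighbours'
    strengths by at most w_max, so strength-based queries have sensitivity
    on the order of d_max * w_max instead of being unbounded.
    """
    w = np.clip(weights, 0.0, w_max)
    n = w.shape[0]
    keep = np.zeros_like(w, dtype=bool)
    for v in range(n):
        # Indices of the d_max heaviest edges incident to v.
        top = np.argsort(w[v])[::-1][:d_max]
        keep[v, top] = True
    # Keep an edge only if both endpoints keep it, preserving symmetry.
    mask = keep & keep.T
    return np.where(mask, w, 0.0)

A symmetric weighted adjacency matrix goes in; a symmetric, weight-clipped, degree-truncated matrix comes out, and any noisy release would then operate on the projected graph.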
“… [32,33], and the value of the privacy budget ε is between 0.5 and 1.5 [22,23]. Due to the randomness of the Laplace noise, and to better measure algorithm performance, we use the mean error of 100 runs for each algorithm.…”
Section: Data Sets and Settings
confidence: 99%
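The evaluation protocol described in the quote (privacy budget ε from 0.5 to 1.5, mean error over 100 runs) can be made concrete with a short sketch; the sensitivity value and the toy histogram below are placeholders, and this is not the authors' experimental code.

# Sketch of the evaluation protocol quoted above: repeat the Laplace
# release 100 times per privacy budget and report the mean L1 error.
import numpy as np


def mean_error(hist: np.ndarray, sensitivity: float, epsilon: float,
               runs: int = 100, seed: int = 0) -> float:
    rng = np.random.default_rng(seed)
    errors = []
    for _ in range(runs):
        noisy = hist + rng.laplace(scale=sensitivity / epsilon, size=hist.shape)
        errors.append(np.abs(noisy - hist).sum())  # L1 error of one run
    return float(np.mean(errors))


if __name__ == "__main__":
    hist = np.array([12.0, 30.0, 25.0, 8.0, 3.0])  # toy strength histogram
    for eps in (0.5, 1.0, 1.5):                    # budgets in the range used by [22,23]
        print(eps, mean_error(hist, sensitivity=10.0, epsilon=eps))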