2020 IEEE 36th International Conference on Data Engineering (ICDE) 2020
DOI: 10.1109/icde48307.2020.00050
Providing Input-Discriminative Protection for Local Differential Privacy

Cited by 43 publications (52 citation statements)
References 17 publications
“…Some existing work [57, 203] aimed to propose personalized LDP-based frameworks for private histogram estimation. Gu et al. [59] presented Input-Discriminative LDP (ID-LDP), a fine-grained privacy notion that reflects the distinct privacy requirements of different inputs. However, adopting personalized/granular privacy constraints still raises further concerns when considering complex system architectures and ensuring good data utility.…”
Section: Discussion and Future Directions
confidence: 99%
“…However, Gu et al. [59] further indicated that different data have distinct sensitivity levels. Thus, they presented Input-Discriminative LDP (ID-LDP), which is a more fine-grained version of LDP.…”
Section: Theoretical Summarization of LDP
confidence: 99%
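To make the input-discriminative idea in the statements above concrete, the sketch below applies a k-ary randomized response whose privacy budget depends on the value being reported, so more sensitive inputs are perturbed more heavily. This is only an illustration of the notion under assumed per-input budgets; the function and value names are hypothetical and it is not the IDUE mechanism proposed in the ID-LDP paper.

```python
import math
import random

def id_rr_perturb(x, domain, eps):
    """Report x truthfully with an input-dependent probability, otherwise
    report a uniformly chosen different value (k-ary randomized response
    with a per-input budget eps[x])."""
    e = math.exp(eps[x])                  # budget chosen for this particular input
    k = len(domain)
    p_keep = e / (e + k - 1)              # larger eps[x] -> report the truth more often
    if random.random() < p_keep:
        return x
    return random.choice([v for v in domain if v != x])

# Hypothetical example: a highly sensitive value gets a smaller budget,
# so it is perturbed more aggressively than less sensitive values.
domain = ["flu", "cold", "hiv"]
eps = {"flu": 2.0, "cold": 2.0, "hiv": 0.5}
noisy_report = id_rr_perturb("hiv", domain, eps)
```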
“…Existing DP-based learning algorithms include local differential privacy (LDP) [20][21][22] and DP-based distributed SGD [23]. In LDP, each client perturbs its information locally and sends only a randomized version to the server, thereby preventing the client's private information from leaking to the server.…”
Section: Related Work
confidence: 99%
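As a minimal sketch of this local model, the snippet below uses generalized randomized response: each client perturbs its value before upload, and the server debiases the noisy counts to estimate frequencies. The mechanism and names are generic illustrations, not the specific constructions from the cited works.

```python
import math
import random
from collections import Counter

def grr_perturb(x, domain, eps):
    """Client side: keep the true value with probability p, else report a random other value."""
    e = math.exp(eps)
    p = e / (e + len(domain) - 1)
    if random.random() < p:
        return x
    return random.choice([v for v in domain if v != x])

def grr_estimate(reports, domain, eps):
    """Server side: unbiased frequency estimates computed only from the noisy reports."""
    e, k, n = math.exp(eps), len(domain), len(reports)
    p = e / (e + k - 1)
    q = 1.0 / (e + k - 1)                 # probability any other value is reported as v
    counts = Counter(reports)
    return {v: (counts[v] / n - q) / (p - q) for v in domain}

# Each client only ever sends a perturbed value; the raw data never leaves the client.
domain = list(range(10))
raw = [random.randint(0, 9) for _ in range(10_000)]
reports = [grr_perturb(x, domain, 2.0) for x in raw]
estimated_freqs = grr_estimate(reports, domain, 2.0)
```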
“…The work in [21] proposed a solution for establishing an SGD procedure compliant with the LDP standard, which supports various important ML tasks. The work in [22] considered server-side distributed estimation over data uploaded by clients and used LDP to protect those data. The work in [23] improved the computational efficiency of DP-based SGD by tracking detailed information about the privacy loss, obtaining an accurate estimate of the overall privacy loss.…”
Section: Related Work
confidence: 99%
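The common template behind such DP-based SGD variants is per-example gradient clipping followed by calibrated Gaussian noise. The sketch below shows one such update step under assumed hyper-parameters (lr, clip, noise_mult); it is an illustration of the general pattern, not the exact algorithm of [21], [22], or [23].

```python
import numpy as np

def dp_sgd_step(params, per_example_grads, lr=0.1, clip=1.0, noise_mult=1.0):
    """One illustrative DP-SGD-style update: clip each example's gradient,
    average, add Gaussian noise scaled to the clipping bound, then step."""
    clipped = [g * min(1.0, clip / (np.linalg.norm(g) + 1e-12))
               for g in per_example_grads]
    mean_grad = np.mean(clipped, axis=0)
    noise = np.random.normal(0.0, noise_mult * clip / len(per_example_grads),
                             size=mean_grad.shape)
    return params - lr * (mean_grad + noise)

# Hypothetical usage with toy per-example gradients.
params = np.zeros(3)
grads = [np.random.randn(3) for _ in range(32)]
params = dp_sgd_step(params, grads)
```

Over many such steps, the cumulative privacy loss is tracked by an accountant; that tracking is the part the efficiency improvement cited above targets.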
“…12a). Results have shown that IDUE(O) achieves higher accuracy than IDUE(R) [18]. However, the effect of the measured value being different from the true value is more significant in the peak dataset.…”
confidence: 96%