2022
DOI: 10.1186/s42400-022-00129-6

Sarve: synthetic data and local differential privacy for private frequency estimation

Abstract: The collection of user attributes by service providers is a double-edged sword. They are instrumental in driving statistical analysis to train more accurate predictive models like recommenders. The analysis of the collected user data includes frequency estimation for categorical attributes. Nonetheless, the users deserve privacy guarantees against inadvertent identity disclosures. Therefore, algorithms called frequency oracles were developed to randomize or perturb user attributes and estimate the frequencies o…
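To make the abstract's notion of a frequency oracle concrete, below is a minimal sketch of one standard ε-LDP mechanism, generalized randomized response (GRR). This is not the paper's Sarve method; the function names, parameters, and the toy distribution are illustrative assumptions.

```python
import numpy as np

def grr_perturb(values, domain_size, epsilon, rng=None):
    """Generalized randomized response: keep the true category with
    probability p = e^eps / (e^eps + k - 1), otherwise report one of the
    other k - 1 categories uniformly at random (satisfies eps-LDP)."""
    rng = np.random.default_rng() if rng is None else rng
    k = domain_size
    values = np.asarray(values)
    p = np.exp(epsilon) / (np.exp(epsilon) + k - 1)
    keep = rng.random(values.shape) < p
    # Adding a random offset in [1, k-1] modulo k guarantees a *different* category.
    offsets = rng.integers(1, k, size=values.shape)
    return np.where(keep, values, (values + offsets) % k)

def grr_estimate(reports, domain_size, epsilon):
    """Unbiased frequency estimates from the perturbed reports."""
    k, n = domain_size, len(reports)
    p = np.exp(epsilon) / (np.exp(epsilon) + k - 1)
    q = 1.0 / (np.exp(epsilon) + k - 1)   # prob. of reporting any specific wrong value
    counts = np.bincount(reports, minlength=k)
    return (counts - n * q) / (p - q)     # de-bias the observed counts

# Toy usage: 100k users, 5 categories, eps = 1; estimates approximate the true mix.
rng = np.random.default_rng(0)
true_values = rng.choice(5, size=100_000, p=[0.4, 0.3, 0.15, 0.1, 0.05])
reports = grr_perturb(true_values, 5, 1.0, rng)
print(grr_estimate(reports, 5, 1.0) / len(true_values))
```

The de-biasing step is what turns randomized reports back into usable statistics: the aggregator never sees raw values, only the perturbed reports and the known perturbation probabilities.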

Cited by 12 publications (4 citation statements). References 82 publications.

Citation statements:
“…+ (n − 1) log γ(ε, …, n) = ε. (71) It is difficult to solve (71) algebraically, but it can easily be solved numerically.…”
Section: Evaluation Results (mentioning; confidence: 99%)
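The excerpt only hints at the form of equation (71), so the sketch below is purely illustrative: it shows how an equation of that shape can be solved numerically with a standard root finder, using a hypothetical stand-in for γ and made-up parameter values.

```python
import numpy as np
from scipy.optimize import brentq

def gamma(eps, delta, n):
    # Hypothetical stand-in for the gamma(., ., .) term quoted above; the real
    # definition from the citing paper is not reproduced in the excerpt.
    return 1.0 + delta * (np.exp(eps) - 1.0) / n

def residual(eps, delta, n, target):
    # LHS minus RHS of an equation of the quoted shape:
    #   (n - 1) * log(gamma(eps, delta, n)) = target
    return (n - 1) * np.log(gamma(eps, delta, n)) - target

# Bracket a sign change and let a standard root finder do the work,
# which is the "solve it numerically" step the excerpt refers to.
eps_star = brentq(residual, 1e-6, 50.0, args=(1e-5, 1000, 1.0))
print(eps_star)
```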
“…Many methods have been proposed for estimating a histogram distribution of users' values under ε-LDP, such as the Randomized Aggregatable Privacy-Preserving Ordinal Response, Sarve, and so on [13, 71]. Although such methods achieve high accuracy, their techniques cannot be applied to a person-to-person interaction scenario.…”
Section: Related Work on LDP (mentioning; confidence: 99%)
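As a complement to the GRR sketch above, here is a minimal unary-encoding perturbation and estimator in the spirit of the basic RAPPOR reporting scheme mentioned in this excerpt. It is a simplified sketch, not the actual RAPPOR implementation (which uses Bloom filters and a two-stage permanent/instantaneous response); all names and parameters are assumptions.

```python
import numpy as np

def sue_perturb(values, domain_size, epsilon, rng=None):
    """Symmetric unary encoding: one-hot encode each value, then report every
    bit truthfully with probability p = e^(eps/2) / (e^(eps/2) + 1) and
    flipped otherwise, which satisfies eps-LDP."""
    rng = np.random.default_rng() if rng is None else rng
    p = np.exp(epsilon / 2) / (np.exp(epsilon / 2) + 1)
    one_hot = np.eye(domain_size, dtype=int)[np.asarray(values)]
    flip = rng.random(one_hot.shape) >= p
    return np.where(flip, 1 - one_hot, one_hot)

def sue_estimate(reports, epsilon):
    """Unbiased per-category counts from the perturbed bit vectors."""
    n = reports.shape[0]
    p = np.exp(epsilon / 2) / (np.exp(epsilon / 2) + 1)
    q = 1.0 - p                                   # prob. a 0-bit is reported as 1
    return (reports.sum(axis=0) - n * q) / (p - q)

# Toy usage: same flavour of setting as the GRR sketch above.
rng = np.random.default_rng(1)
true_values = rng.choice(4, size=50_000, p=[0.5, 0.25, 0.15, 0.1])
reports = sue_perturb(true_values, 4, 1.0, rng)
print(sue_estimate(reports, 1.0) / len(true_values))
```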
“…The concept of DP has been well-established and applied to the concept of FL [218][219][220]. DP has also been utilized for various use cases involving synthetic data [221][222][223][224].…”
Section: Privacy (mentioning; confidence: 99%)
“…Brauneck et al. 63 recently reviewed privacy-enhancing technologies (PETs) from a legal standpoint to engage in a thoughtful discussion of how GDPR legislation in the European Union (EU) relates to commonly used PETs including federated learning (FL), secure multiparty computation (SMPC), and differential privacy (DP). DP, a concept first proposed in 2006 by Dwork et al. 64, is gaining broad acceptance as a solid, practical, and trustworthy privacy framework, and its application has also been explored with synthetic data 65–67. DP is a precise mathematical constraint that ensures the privacy of individual pieces of information in a database while answering queries about the aggregate.…”
Section: Regulatory Blind Spots and Proposals (mentioning; confidence: 99%)
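For reference, the "precise mathematical constraint" this excerpt alludes to is the standard ε-differential-privacy inequality of Dwork et al., written out below.

```latex
% Standard epsilon-DP constraint (Dwork et al., 2006): for a randomized
% mechanism M, all neighbouring datasets D, D' (differing in one record),
% and all measurable output sets S:
\[
  \Pr[\mathcal{M}(D) \in S] \;\le\; e^{\varepsilon}\,\Pr[\mathcal{M}(D') \in S].
\]
```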