2023
DOI: 10.1016/j.cose.2023.103147

Towards practical differential privacy in data analysis: Understanding the effect of epsilon on utility in private ERM

Cited by 8 publications (5 citation statements)
References 2 publications
“…Furthermore, to guarantee meaningful DP, ϵ is consistently set smaller than 1.0 and δ is always set smaller than 1/n, where n denotes the size of the database [20]. Moreover, smaller privacy budgets decrease the utility of the system, which is in line with common sense and leads to a tradeoff between utility and privacy; corresponding discussions appear in machine learning [48], data-processing systems [49], open data sharing [50], etc. The tradeoff is also shown by the experimental results given in Section 5.…”
Section: Remark
mentioning confidence: 99%
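To make the tradeoff in this excerpt concrete, here is a minimal sketch using the standard Laplace mechanism; the function name and query values are illustrative assumptions, not taken from the cited papers. For a sensitivity-1 count query, the noise scale is 1/ϵ, so the expected absolute error grows inversely with the budget.

```python
import numpy as np

def laplace_count(true_count: float, epsilon: float, sensitivity: float = 1.0) -> float:
    """Release a count under epsilon-DP via the Laplace mechanism.

    The noise scale is sensitivity / epsilon, so a smaller privacy
    budget (stronger privacy) means more noise and lower utility.
    """
    return true_count + np.random.laplace(loc=0.0, scale=sensitivity / epsilon)

# Expected absolute error is sensitivity / epsilon: it doubles when the
# budget is halved, which is the utility-privacy tradeoff described above.
for eps in [0.1, 0.5, 1.0]:  # budgets at or below the "meaningful DP" threshold of 1.0
    draws = np.array([laplace_count(1000, eps) for _ in range(10_000)])
    print(f"epsilon={eps}: mean |error| ~ {np.mean(np.abs(draws - 1000)):.2f}")
```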
“…All three metrics (DM, accuracy, and PD) were used to measure the effectiveness of the proposed approach. For fair comparisons, we prepared anonymized versions of both datasets with varying scales of k. We used both large-scale values (L_s, where L_s = D_k and k ∈ {50, 75, 100, 125, 150, 175, 200, 250}) and small-scale values (S_s, where S_s = D_k and k ∈ {5, 10, 15, 20, 25, 30, 35, 40}) of k to demonstrate the potency of the proposed approach.…”
Section: Accuracy
mentioning confidence: 99%
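The DM (discernibility metric) named in this excerpt is conventionally defined as the sum of squared equivalence-class sizes of a k-anonymized table; the sketch below assumes that standard definition, since the excerpt itself does not spell it out, and the record values are hypothetical.

```python
from collections import Counter

def discernibility_metric(quasi_identifier_tuples):
    """Standard discernibility metric for a k-anonymized table:
    every record is penalized by the size of its equivalence class,
    giving DM = sum over classes of |class|^2 (lower = better utility).
    """
    class_sizes = Counter(quasi_identifier_tuples)
    return sum(size * size for size in class_sizes.values())

# Hypothetical generalized quasi-identifiers (age range, ZIP prefix)
records = [("20-30", "123**")] * 5 + [("30-40", "124**")] * 3
print(discernibility_metric(records))  # 5^2 + 3^2 = 34
```

Larger k forces larger equivalence classes, which drives DM up; that is why the excerpt sweeps both small and large k to chart the utility cost.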
“…DP [24] and its improved versions are mostly used in this mechanism of PPDP. Although DP provides strong privacy guarantees, the utility of the resulting dataset is often low, especially when a small ϵ is used [25,26]. Furthermore, the amount of noise injected by the DP model into less frequent parts of the data is very high, leading to poor utility in data-driven applications.…”
mentioning confidence: 99%
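A brief numerical sketch of that last point, with made-up counts rather than the cited experiments: the Laplace mechanism adds noise of the same absolute magnitude everywhere, so rare categories suffer far larger relative error than frequent ones.

```python
import numpy as np

epsilon = 1.0
scale = 1.0 / epsilon  # Laplace scale for sensitivity-1 count queries

for true_count in (10_000, 10):  # a frequent vs. a rare part of the data
    noisy = true_count + np.random.laplace(0.0, scale, size=100_000)
    rel_error = np.mean(np.abs(noisy - true_count)) / true_count
    print(f"count={true_count}: mean relative error ~ {rel_error:.4f}")
# The absolute noise is identical in both cases, so the rare count's
# relative error is ~1000x larger -- the sparse-data effect noted above.
```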
“…However, current research shows that calculating the achieved privacy level a posteriori is still important for auditing the DP application process, because hyperparameter tuning is complex and may result in unwanted and unnoticed errors. That is why there are methods to calculate approximate lower and upper bounds on the proposed privacy level [101,125–128]. For example, Florian Tramèr et al. debugged a recently proposed DP training mechanism, showing with ∼99.9% confidence that the method achieves not the claimed privacy level of ε = 0.21 but rather a lower bound of ε > 2.79.…”
mentioning confidence: 99%
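A sketch of the standard idea behind such empirical lower bounds, in the general style of the auditing literature the excerpt cites; the attack counts and helper names below are hypothetical. One runs a distinguishing attack on neighboring datasets, bounds its true- and false-positive rates with Clopper-Pearson intervals, and uses the ε-DP constraint TPR ≤ e^ε · FPR to obtain a high-confidence lower bound on ε.

```python
from math import log
from scipy.stats import beta

def clopper_pearson(successes, trials, alpha):
    """Exact (1 - alpha) two-sided confidence interval for a binomial rate."""
    lo = beta.ppf(alpha / 2, successes, trials - successes + 1) if successes > 0 else 0.0
    hi = beta.ppf(1 - alpha / 2, successes + 1, trials - successes) if successes < trials else 1.0
    return lo, hi

def epsilon_lower_bound(true_pos, false_pos, trials, alpha=0.001):
    """Any epsilon-DP mechanism forces TPR <= exp(epsilon) * FPR for every
    distinguishing attack, so eps >= log(TPR_lower / FPR_upper) holds with
    high confidence (per one-sided bound)."""
    tpr_lo, _ = clopper_pearson(true_pos, trials, alpha)
    _, fpr_hi = clopper_pearson(false_pos, trials, alpha)
    return log(tpr_lo / fpr_hi)

# Hypothetical attack results: 900/1000 hits on members, 50/1000 on non-members
print(epsilon_lower_bound(true_pos=900, false_pos=50, trials=1000))  # roughly 2.5
```

If the bound exceeds the ε claimed for the mechanism, as in the Tramèr et al. example, the implementation or its analysis must be flawed.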
“…With this, the goal is to decrease the time and computing resources needed to find the target privacy-utility trade-off using empirical risk minimization (ERM). However, such research is limited to specific use cases, e.g., objective function perturbation [125,129]. For this reason, implementers could use more abstract guidelines.…”
mentioning confidence: 99%
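For reference, objective function perturbation as mentioned here follows the scheme of Chaudhuri et al. (2011): a random linear term is added to the regularized ERM objective before optimization. The sketch below is a simplified illustration, not a faithful implementation; it omits the curvature-dependent ε correction the full method requires, and it assumes norm-clipped features and labels in {-1, +1}.

```python
import numpy as np
from scipy.optimize import minimize

def sample_perturbation(dim, epsilon, rng):
    """Draw b with density proportional to exp(-epsilon * ||b|| / 2):
    a uniformly random direction with a Gamma(dim, 2/epsilon) norm."""
    direction = rng.normal(size=dim)
    direction /= np.linalg.norm(direction)
    return rng.gamma(shape=dim, scale=2.0 / epsilon) * direction

def private_logistic_regression(X, y, epsilon, lam=0.1, seed=0):
    """Objective perturbation for L2-regularized logistic regression,
    simplified from Chaudhuri et al. (2011); the true epsilon also needs
    a correction for the loss curvature, omitted here for brevity.
    Assumes ||x_i|| <= 1 and y_i in {-1, +1}."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    b = sample_perturbation(d, epsilon, rng)

    def objective(w):
        margins = y * (X @ w)
        data_loss = np.mean(np.logaddexp(0.0, -margins))  # logistic loss
        return data_loss + 0.5 * lam * w @ w + (b @ w) / n

    return minimize(objective, np.zeros(d), method="L-BFGS-B").x

# Toy usage on synthetic, norm-clipped data
rng = np.random.default_rng(1)
X = rng.normal(size=(500, 5))
X /= np.maximum(1.0, np.linalg.norm(X, axis=1, keepdims=True))
y = np.where(X @ np.ones(5) + 0.1 * rng.normal(size=500) > 0, 1.0, -1.0)
print(private_logistic_regression(X, y, epsilon=0.5))
```

Because the noise enters the objective once rather than at every gradient step, a single optimization run yields a private model, which is why this family of methods is attractive for the resource-constrained tuning the excerpt describes.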