2021
DOI: 10.1016/j.neucom.2021.07.001

Differentially private empirical risk minimization for AUC maximization

Cited by 75 publications (165 citation statements)
References 12 publications
“…In this regime, other algorithms (e.g. gradient descent output perturbation [ZZMW17], noisy SVRG [WYX17]) that use the alternative form of Gaussian noise are not even differentially private in general, as shown in [ZWB+19, Theorem 1]. For ε ≤ 1, the condition √ε ≤ c_δ automatically holds for any δ ∈ (0, 1/2).…”
Section: Excess Risk Bounds For Strongly Convex Lipschitz Functions (mentioning)
confidence: 99%
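For context, the classical Gaussian mechanism is guaranteed (ε, δ)-differentially private only for ε ≤ 1, with noise scale σ = Δ₂·√(2 ln(1.25/δ))/ε, which is the kind of regime the statement above contrasts with. Below is a minimal Python sketch of output perturbation under that classical calibration; the function names and the assumption that the caller supplies the L2 sensitivity of the exact minimizer are illustrative, not taken from the cited works.

    import numpy as np

    def classical_gaussian_sigma(l2_sensitivity, eps, delta):
        # Classical Gaussian mechanism noise scale, valid only for eps <= 1:
        # sigma >= sensitivity * sqrt(2 * ln(1.25 / delta)) / eps
        assert 0 < eps <= 1 and 0 < delta < 0.5
        return l2_sensitivity * np.sqrt(2.0 * np.log(1.25 / delta)) / eps

    def output_perturbation(w_nonprivate, l2_sensitivity, eps, delta,
                            rng=np.random.default_rng(0)):
        # Release a noisy copy of the non-private ERM minimizer; the noise is
        # scaled to how much a single sample can move that minimizer (its L2
        # sensitivity).
        sigma = classical_gaussian_sigma(l2_sensitivity, eps, delta)
        return w_nonprivate + rng.normal(0.0, sigma, size=w_nonprivate.shape)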
“…In the case of empirical risk minimization (ERM), where F(w, X) = (1/n) Σ_{i=1}^{n} f(w, x_i), algorithms for maintaining differential privacy while solving Equation (1) are well studied [CMS11, BST14, ZZMW17, WYX17]. Here (and throughout) X = (x_1, …, x_n) is a data set with observations in some set 𝒳 ⊆ ℝ^q.…”
mentioning
confidence: 99%
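As a concrete illustration of the ERM objective F(w, X) = (1/n) Σ_{i=1}^{n} f(w, x_i), the minimal Python sketch below evaluates the average per-sample loss over a data set; the choice of logistic loss and the (features, label) data layout are illustrative assumptions, not part of the cited formulation.

    import numpy as np

    def logistic_loss(w, x):
        # Per-sample loss f(w, x); here each x = (features, label) with label in {-1, +1}.
        features, label = x
        return np.log1p(np.exp(-label * features.dot(w)))

    def erm_objective(w, X):
        # F(w, X) = (1/n) * sum_i f(w, x_i): the empirical average of the per-sample losses.
        return np.mean([logistic_loss(w, x) for x in X])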
“…However, getting the optimal bounds with small gradient complexity for the non-smooth case turns out to be a more difficult problem. This was noted by [WYX17], who raised it as an important open problem. This question was answered in [BFTT19], who gave an algorithm with almost optimal excess empirical risk.…”
Section: Introduction (mentioning)
confidence: 96%
“…In the gradient perturbation approach, we add noise to the first-order information using optimization algorithms such as Stochastic Gradient Descent (SGD). This approach was first proposed in [BST14], was later extended by [TTZ14, WYX17], and has led to the state-of-the-art theoretical bounds for DP-ERM. For an experimental comparison of various approaches to solving DP-ERM we refer the readers to [RBHT09, INS+19].…”
Section: Introduction (mentioning)
confidence: 99%
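A minimal sketch of a gradient-perturbation (DP-SGD style) update is shown below: per-sample gradients are clipped in L2 norm and Gaussian noise is added before the step. The clipping threshold, noise scale, and function names are illustrative assumptions; calibrating the noise scale to a target (ε, δ) requires the composition analyses developed in the works cited above.

    import numpy as np

    def dp_sgd_step(w, batch, grad_fn, lr=0.1, clip=1.0, sigma=1.0,
                    rng=np.random.default_rng(0)):
        # Gradient perturbation: clip each per-sample gradient to L2 norm <= clip,
        # average over the batch, then add Gaussian noise before the SGD update.
        grads = []
        for x in batch:
            g = grad_fn(w, x)
            g = g * min(1.0, clip / (np.linalg.norm(g) + 1e-12))
            grads.append(g)
        noisy_grad = np.mean(grads, axis=0) + rng.normal(
            0.0, sigma * clip / len(batch), size=w.shape)
        return w - lr * noisy_grad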