2023
DOI: 10.1016/j.jksuci.2022.11.017
Differentially private block coordinate descent

Cited by 2 publications (2 citation statements)
References 20 publications
“…This intuitively limits the amount of information that the model can learn from any single example. Noise sampled from a Gaussian distribution [96] with standard deviation Cσ is then added. As the algorithm shows, the hyperparameters C and σ can be tuned to give (ε, δ) guarantees for each step of gradient descent.…”
Section: Fundamental Law Of Information Recovery
confidence: 99%
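The clip-and-noise step described in the statement above can be sketched as follows. This is a minimal NumPy illustration, not the cited paper's implementation; the function and variable names are hypothetical:

```python
import numpy as np

def noisy_gradient(per_example_grads, C, sigma, rng):
    """One differentially private step: clip each per-example gradient
    to L2 norm at most C, average, then add Gaussian noise with
    standard deviation C * sigma."""
    clipped = []
    for g in per_example_grads:
        norm = np.linalg.norm(g)
        # Scale the gradient down only if its norm exceeds the clip bound C.
        clipped.append(g / max(1.0, norm / C))
    mean_grad = np.mean(clipped, axis=0)
    noise = rng.normal(0.0, C * sigma, size=mean_grad.shape)
    return mean_grad + noise
```

Raising C admits more signal per example but requires proportionally more noise for the same (ε, δ) guarantee; tuning the two together is the trade-off the statement refers to.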
“…To track privacy loss over the entire gradient descent, this article studied the papers [96], [97], which describe the moments accountant [95] for tracking the privacy budget. The authors first examined the naïve analysis of simply adding up per-step privacy budgets [95].…”
Section: Fundamental Law Of Information Recovery
confidence: 99%
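The naïve analysis mentioned above is basic sequential composition: the (ε, δ) budgets of successive private steps simply add. A minimal sketch, with a hypothetical helper name (the moments accountant replaces this with a much tighter accounting of the same quantity):

```python
def naive_composition(epsilons, deltas):
    """Naive sequential composition of differential privacy budgets:
    running T mechanisms with budgets (eps_i, delta_i) yields a mechanism
    with budget (sum of eps_i, sum of delta_i)."""
    return sum(epsilons), sum(deltas)
```

Under this bound, four steps at (0.5, 0) each compose to (2.0, 0); the linear growth in ε is what makes the naïve bound too loose for training runs with many gradient steps.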