2011
DOI: 10.1137/090756090

What Can We Learn Privately?

Abstract: Learning problems form an important category of computational tasks that generalizes many of the computations researchers apply to large real-life data sets. We ask: what concept classes can be learned privately, namely, by an algorithm whose output does not depend too heavily on any one input or specific training example? More precisely, we investigate learning algorithms that satisfy differential privacy, a notion that provides strong confidentiality guarantees in the contexts where aggregate information is …
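For readers unfamiliar with the privacy notion the abstract refers to, the standard ε-differential-privacy condition (a textbook formulation, not quoted from this paper) can be written as follows.

```latex
% A randomized algorithm M is \varepsilon-differentially private if, for all
% pairs of datasets D, D' differing in a single record and all sets S of outputs,
\Pr[\,M(D) \in S\,] \;\le\; e^{\varepsilon} \cdot \Pr[\,M(D') \in S\,].
```

Smaller ε means the output distribution changes less when any one individual's record changes, which is the "does not depend too heavily on any one input" property the abstract describes.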

Cited by 615 publications (182 citation statements)
References 37 publications
“…The most important are queries with outputs that cannot naturally be thought of as tuples of real numbers. This includes, e.g., queries that return classifiers (as in [9]), graphs, or synthetic databases.…”
Section: Summary Of Our Results
confidence: 99%
“…The standard definition of differential privacy is very strong, requiring unconditional privacy guarantees against computationally unbounded adversaries. Despite this fact, there has been a good amount of success in designing differentially private mechanisms for many types of queries and in various settings [1,5,12,2,9].…”
Section: Introduction
confidence: 99%
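To make "differentially private mechanisms for many types of queries" in the statement above concrete, here is a minimal sketch of the standard Laplace mechanism for a single numeric query; the function name and interface are illustrative and are not taken from any of the cited works.

```python
import random

def laplace_mechanism(true_answer: float, sensitivity: float, epsilon: float) -> float:
    """Release a numeric query answer under epsilon-differential privacy by
    adding Laplace noise with scale sensitivity / epsilon (the standard
    Laplace mechanism; the name and interface here are illustrative)."""
    scale = sensitivity / epsilon
    # A Laplace(0, scale) sample: the difference of two independent Exp(1)
    # variables, scaled by `scale`.
    noise = scale * (random.expovariate(1.0) - random.expovariate(1.0))
    return true_answer + noise

# Example: a counting query changes by at most 1 when one record changes
# (sensitivity 1), so a private count can be released as:
# noisy_count = laplace_mechanism(true_count, sensitivity=1.0, epsilon=0.5)
```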
“…The concept of local differential privacy was first proposed by S. P. Kasiviswanathan et al. in [16]. The authors of [4] proposed RAPPOR, a scheme that addresses the problem of longitudinal data collection under local differential privacy.…”
Section: B. Local Differential Privacy
confidence: 99%
“…Local differential privacy (LDP) [16] is a rigorous privacy notion in the local setting, which provides a stronger privacy guarantee than traditional differential privacy, because the mechanism does not require any trusted third party.…”
Section: A. Local Differential Privacy
confidence: 99%
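The "no trusted third party" point in the statement above is easiest to see through randomized response, the canonical locally differentially private primitive: each user perturbs their own bit before it ever leaves their device. The sketch below is illustrative; the function names and the aggregation helper are assumptions, not code from the cited papers.

```python
import math
import random

def randomized_response(true_bit: bool, epsilon: float) -> bool:
    """Perturb one private bit on the user's own device so the report is
    epsilon-locally differentially private (standard randomized response)."""
    # Report the truth with probability e^eps / (e^eps + 1), otherwise flip it.
    p_truth = math.exp(epsilon) / (math.exp(epsilon) + 1.0)
    return true_bit if random.random() < p_truth else not true_bit

def estimate_fraction(reports, epsilon: float) -> float:
    """Unbiased estimate of the fraction of users whose true bit is 1,
    computed by an untrusted aggregator from the noisy reports alone."""
    p = math.exp(epsilon) / (math.exp(epsilon) + 1.0)
    observed = sum(reports) / len(reports)
    # Invert the bias: E[observed] = p*f + (1 - p)*(1 - f).
    return (observed - (1.0 - p)) / (2.0 * p - 1.0)

if __name__ == "__main__":
    random.seed(0)
    eps = 1.0
    true_bits = [random.random() < 0.3 for _ in range(10_000)]  # ~30% hold the bit
    noisy = [randomized_response(b, eps) for b in true_bits]
    print(f"estimated fraction: {estimate_fraction(noisy, eps):.3f}")
```

The raw bit never leaves the user, which is the sense in which LDP removes the trusted curator assumed by standard (central) differential privacy.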