2017
DOI: 10.48550/arxiv.1709.02753
Preprint

Privacy Loss in Apple's Implementation of Differential Privacy on MacOS 10.12

Abstract: In June 2016, Apple made a bold announcement that it would deploy local differential privacy for some of its user data collection in order to ensure the privacy of user data, even from Apple [21,23]. The details of Apple's approach remained sparse. Although several patents [17][18][19] have since appeared hinting at the algorithms that may be used to achieve differential privacy, they did not include a precise explanation of the approach taken to privacy parameter …
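The local differential privacy the abstract refers to is easiest to see through randomized response, the textbook local mechanism: each user perturbs their own bit before it ever leaves the device, so the collector never sees raw data. This is a minimal illustrative sketch of that general idea, not Apple's actual algorithm (which the paper reverse-engineers); the function names and parameters here are chosen for illustration only.

```python
import math
import random

def randomized_response(true_bit: int, epsilon: float) -> int:
    """Report one private bit under epsilon-local differential privacy.

    The true bit is sent with probability e^eps / (e^eps + 1) and its flip
    otherwise, so Pr[report | bit=1] / Pr[report | bit=0] <= e^eps.
    """
    p_truth = math.exp(epsilon) / (math.exp(epsilon) + 1.0)
    return true_bit if random.random() < p_truth else 1 - true_bit

def estimate_frequency(reports: list, epsilon: float) -> float:
    """Debias the aggregated reports to estimate the true fraction of 1-bits."""
    p = math.exp(epsilon) / (math.exp(epsilon) + 1.0)
    observed = sum(reports) / len(reports)
    # E[observed] = f*p + (1 - f)*(1 - p); solve for the true fraction f.
    return (observed - (1.0 - p)) / (2.0 * p - 1.0)

# Example: 100,000 users, 30% of whom truly have the bit set, eps = 2.
reports = [randomized_response(int(random.random() < 0.3), 2.0)
           for _ in range(100_000)]
print(estimate_frequency(reports, 2.0))  # close to 0.3
```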

Cited by 85 publications, 2018–2024 (83 citation statements: 0 supporting, 83 mentioning, 0 contrasting)
References 10 publications
“…Thus, this question is linked to the problem of how to establish the value of ε. Researchers have debated how to choose this value since the introduction of differential privacy, and there have been several proposals [2,9,13,19]. In particular, [13] showed that the privacy protection afforded by an arbitrary ε can be infringed by inference attacks, and it proposed a method for setting ε based on the posterior belief.…”
Section: Related Work
confidence: 99%
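For context on what the ε debated in that statement quantifies: under the standard definition of Dwork et al. (2006), a randomized mechanism M is ε-differentially private if, for all datasets D and D' differing in a single record and all sets S of outputs,

```latex
\Pr[M(D) \in S] \;\le\; e^{\varepsilon} \cdot \Pr[M(D') \in S].
```

Smaller ε gives a tighter bound and hence stronger privacy, which is why the choice of ε directly controls the protection level.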
“…Erlingsson et al 2014), Apple (e.g. Tang et al 2017), Microsoft (e.g. Ding et al 2017), and pressure from regulatory bodies (e.g.…”
Section: Introduction
confidence: 99%
“…As more data is collected, analyzed, and published by researchers, companies, and government agencies, concerns about the privacy of the participating individuals have become more prominent (Lane et al, 2014). While there have been many methods of statistical disclosure control to combat this problem (Hundepool et al, 2012), differential privacy (DP) (Dwork et al, 2006) has arisen as the state-of-the-art framework for privacy protection, and is currently being implemented by Google (Erlingsson et al, 2014), Apple (Tang et al, 2017), Microsoft (Ding et al, 2017), and the US Census (Abowd, 2018). Differential privacy is based on a notion of plausible deniability, and requires the introduction of additional noise, beyond sampling, into the analysis procedure.…”
Section: Introduction
confidence: 99%
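The "additional noise, beyond sampling" that the last statement refers to is typically calibrated to how much the query answer can change when one record changes, as in the Laplace mechanism of Dwork et al. (2006). A minimal sketch of that mechanism follows; the function name and parameters are illustrative, not drawn from any cited implementation.

```python
import random

def laplace_mechanism(true_answer: float, sensitivity: float, epsilon: float) -> float:
    """Release a query answer with noise drawn from Laplace(0, sensitivity/epsilon).

    This satisfies epsilon-differential privacy when `sensitivity` bounds how
    much the answer can change if one individual's record is added or removed.
    """
    scale = sensitivity / epsilon
    # The difference of two independent Exp(1) draws is Laplace(0, 1); scale it.
    noise = scale * (random.expovariate(1.0) - random.expovariate(1.0))
    return true_answer + noise

# Example: a counting query changes by at most 1 per person, so sensitivity = 1.
noisy_count = laplace_mechanism(true_answer=1024.0, sensitivity=1.0, epsilon=0.5)
```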