Proceedings of the 22nd ACM SIGSAC Conference on Computer and Communications Security 2015
DOI: 10.1145/2810103.2813640

Protecting Locations with Differential Privacy under Temporal Correlations

Abstract: Concerns on location privacy frequently arise with the rapid development of GPS enabled devices and location-based applications. While spatial transformation techniques such as location perturbation or generalization have been studied extensively, most techniques rely on syntactic privacy models without rigorous privacy guarantee. Many of them only consider static scenarios or perturb the location at single timestamps without considering temporal correlations of a moving user's locations, and hence are vulnerable…

Cited by 316 publications (282 citation statements)
References: 43 publications
“…Liu et al [25] inferred the dependence coefficient, distributed in the interval [0, 1], to evaluate the probabilistic correlation between two tuples in a more fine-grained manner, thus reducing the query sensitivity, which results in less noise. Considering the temporal correlations of a moving user's locations, the work in [26] leveraged a hidden Markov model to establish a location set and proposed a variant of DP to protect location privacy. Wu et al [27] proposed the definition of correlated differential privacy to evaluate the real privacy level of a single dataset influenced by other datasets when multiple datasets are correlated.…”
Section: Related Work (mentioning)
confidence: 99%
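The location-set construction in [26] (the paper summarized on this page, where it is called a δ-location set) can be illustrated with a minimal sketch: the prior over the user's possible cells at time t is obtained by pushing the previous posterior through the Markov transition matrix, and the smallest set of cells whose prior mass reaches 1 − δ is kept as the plausible region to protect. The sketch below is an assumption-laden illustration; names such as delta_location_set and transition_matrix are ours, not the authors' code.

```python
import numpy as np

def delta_location_set(prev_posterior, transition_matrix, delta=0.05):
    """Illustrative delta-location set under a first-order Markov mobility model.

    prev_posterior    : probability vector over grid cells at time t-1
    transition_matrix : row-stochastic matrix, P[i, j] = Pr(move from cell i to cell j)
    delta             : small probability mass that may be excluded from the set
    Returns indices of the smallest set of cells whose prior probability
    at time t sums to at least 1 - delta.
    """
    prior_t = prev_posterior @ transition_matrix           # Markov prior at time t
    order = np.argsort(prior_t)[::-1]                      # most probable cells first
    cumulative = np.cumsum(prior_t[order])
    cutoff = np.searchsorted(cumulative, 1.0 - delta) + 1  # minimal covering prefix
    return order[:cutoff]

# Example: 4-cell grid, the user was almost certainly in cell 0 at time t-1
P = np.array([[0.7, 0.2, 0.1, 0.0],
              [0.1, 0.6, 0.2, 0.1],
              [0.0, 0.3, 0.5, 0.2],
              [0.0, 0.1, 0.3, 0.6]])
posterior_prev = np.array([0.9, 0.1, 0.0, 0.0])
print(delta_location_set(posterior_prev, P))               # e.g. [0 1 2]
```

Roughly, the DP variant proposed in [26] then requires the released location to be indistinguishable only among the cells in this set, which is what keeps the noise bounded under temporal correlation.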
“…Indeed, probably thanks to its efficiency, geoindistinguishability via the Laplacian mechanism has been adopted as the basis or as a component of several tools and frameworks for location privacy, including: Location Guard [10], LP-Guardian [11], LP-Doctor [12], the system for secure nearby-friends discovery in [13], SpatialVision QGIS plugin [14], and it is one of the possible input methods in STAC [15]. Furthermore, the PIM mechanism [16] can be considered an extension of the planar Laplacian to the case of traces (temporally correlated sequences of points): The authors of [16] attack the problem of the degradation of privacy due to correlation by adding Laplacian noise directly to the convex hull of the trace.…”
Section: Introduction (mentioning)
confidence: 99%
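For reference, the planar Laplacian mechanism mentioned above admits a very small implementation. The sketch below covers only the per-report noise (not the convex-hull extension of [16] applied to whole traces), and the helper name planar_laplace is an assumption of ours: the noise radius is drawn from a Gamma(2, 1/ε) distribution and the angle uniformly, which gives a density decaying as exp(−ε · distance).

```python
import numpy as np

def planar_laplace(location, epsilon, rng=None):
    """Perturb a 2-D location with planar Laplacian noise (geo-indistinguishability).

    location : (x, y) coordinates, in the same distance unit that epsilon refers to
    epsilon  : privacy parameter per unit of distance (smaller = more noise)
    The noise density is proportional to exp(-epsilon * r), so the radius follows
    a Gamma(shape=2, scale=1/epsilon) distribution and the angle is uniform.
    """
    if rng is None:
        rng = np.random.default_rng()
    r = rng.gamma(shape=2.0, scale=1.0 / epsilon)  # noise magnitude
    theta = rng.uniform(0.0, 2.0 * np.pi)          # noise direction
    x, y = location
    return (x + r * np.cos(theta), y + r * np.sin(theta))

# Example: report a location with epsilon = 0.1 per metre (tens of metres of noise)
print(planar_laplace((1250.0, 830.0), epsilon=0.1))
```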
“…When the adversary has no such side information, the expected distance error is not sufficient for quantifying location privacy. As a result, differential privacy [6,26], which abstracts from the adversary's side information, has been gaining popularity in LBS privacy protection; it measures the ability of an adversary with arbitrary background knowledge to obtain the user's real location. However, as noted in [27], this metric can be problematic if the prior is taken into account.…”
Section: Location Privacy Metrics (mentioning)
confidence: 99%
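One common formalisation of the expected distance error discussed above averages, over the prior and the obfuscation channel, the distance between the true location and a Bayesian adversary's best estimate. The sketch below is a hedged illustration under that formalisation; the names expected_distance_error, channel, and the toy inputs are assumptions, not a reference implementation from [4,9].

```python
import numpy as np

def expected_distance_error(prior, channel, distances):
    """Illustrative expected-distance-error metric on a discrete location grid.

    prior     : prior[i]        = Pr(true location = i)  (adversary's side information)
    channel   : channel[i, j]   = Pr(reported location = j | true location = i)
    distances : distances[i, k] = physical distance between cells i and k
    The adversary observes report j, forms the posterior over true locations,
    and guesses the cell minimising its posterior-expected distance; the metric
    averages that distance over true locations and reports.
    """
    joint = prior[:, None] * channel                 # Pr(true = i, report = j)
    error = 0.0
    for j in range(channel.shape[1]):
        mass = joint[:, j].sum()                     # Pr(report = j)
        if mass == 0:
            continue
        posterior = joint[:, j] / mass
        guess = np.argmin(posterior @ distances)     # adversary's best estimate
        error += mass * (posterior @ distances[:, guess])
    return error

# Example: three cells on a line, uniform prior, noisy reporting channel
prior = np.array([1 / 3, 1 / 3, 1 / 3])
channel = np.array([[0.8, 0.2, 0.0],
                    [0.1, 0.8, 0.1],
                    [0.0, 0.2, 0.8]])
coords = np.array([0.0, 1.0, 2.0])
distances = np.abs(coords[:, None] - coords[None, :])
print(expected_distance_error(prior, channel, distances))
```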
“…Expected-distance-error based schemes [4,9] obfuscate the user's location by taking the adversary's side information into account, and they also suffer from background knowledge attacks. Differential privacy based schemes [6,26] have gained popularity as they abstract from the adversary's side information and are capable of providing strong worst-case privacy guarantees. However, these approaches do not take the contextual information of the user's location into account and are not sufficient to protect users from re-identification [34].…”
Section: Location Privacy Protection (mentioning)
confidence: 99%