2017
DOI: 10.1007/978-3-319-61176-1_12
Privacy-Preserving Outlier Detection for Data Streams

Cited by 8 publications (12 citation statements)
References 17 publications
“…We refer to such privacy notions as Relaxed-DP. In the context of outlier analysis, examples of Relaxed-DP are: anomaly-restricted DP [5], protected DP [15], DP over relaxed sensitivity [6], and sensitive privacy [2]. Note that if the outlier model is data-dependent (i.e.…”
Section: Privacy: The 2nd Constituent of the Problem
confidence: 99%
“…Let $f$ be the outlier query that to each party outputs its records (in $x_t$) that are among the top-$k$ outliers in $x$, i.e., the $k$ records with the smallest AVF score. Clearly, $f$ is an endo-query, and for $x$, $\mathcal{L}_t(f) = \{\, i \in f(x) : i \in x_t \,\}$.…”
confidence: 99%
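The excerpt above defines the top-$k$ outlier query in terms of AVF (Attribute Value Frequency) scores. The sketch below is only a minimal illustration of that definition; the function names, toy data, and the single party index set are assumptions made here, not the construction or API of the cited work.

```python
from collections import Counter

def avf_scores(records):
    """Mean Attribute Value Frequency (AVF) score of each record.

    A record's AVF score is the average frequency of its attribute
    values over the whole dataset; records built from rare values get
    low scores and are treated as outliers.
    """
    n_attrs = len(records[0])
    # Per-attribute frequency tables over the full dataset.
    freq = [Counter(rec[j] for rec in records) for j in range(n_attrs)]
    return [sum(freq[j][rec[j]] for j in range(n_attrs)) / n_attrs
            for rec in records]

def top_k_outliers(records, k):
    """Indices of the k records with the smallest AVF score."""
    scores = avf_scores(records)
    return sorted(range(len(records)), key=lambda i: scores[i])[:k]

# Toy multiparty view: x is the union of all parties' records and
# party_t holds the indices owned by party t, so the set printed below
# plays the role of L_t(f) = {i in f(x) : i in x_t} from the excerpt.
x = [("a", "u"), ("a", "u"), ("a", "v"), ("b", "w")]
party_t = {3}
outliers = set(top_k_outliers(x, k=2))
print(outliers & party_t)  # records of party t among the top-k outliers
```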
“…To address the collusion issue in [10], the Random Multiparty Perturbation (RMP) technique [12] was proposed, which allows each party to use a unique perturbation matrix to randomise its data. A recent differential-privacy-based work [11] leverages a relaxed version of differential privacy to process data streams. However, differential-privacy-based approaches incur an accuracy loss in practice, whereas our PPOD does not degrade accuracy compared to the outlier detection algorithm for unencrypted datasets.…”
Section: Related Work
confidence: 99%
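The excerpt above attributes collusion resistance in RMP [12] to each party using its own perturbation matrix. The following is only a generic sketch of that idea, per-party random matrix perturbation with assumed names and parameters; it is not the actual RMP construction from [12].

```python
import numpy as np

def perturb_locally(local_data, rng, scale=1.0):
    """Perturb one party's data with a matrix known only to that party.

    Each party draws its own random square matrix, so there is no single
    shared matrix whose disclosure would expose every party's data.
    Generic illustration only; not the RMP scheme of [12].
    """
    d = local_data.shape[1]
    R = rng.normal(scale=scale / np.sqrt(d), size=(d, d))  # party-specific matrix
    return local_data @ R

# Two parties, each with its own seed and hence its own matrix (toy data).
x1 = np.array([[1.0, 2.0], [3.0, 4.0]])
x2 = np.array([[5.0, 6.0], [7.0, 8.0]])
y1 = perturb_locally(x1, np.random.default_rng(1))
y2 = perturb_locally(x2, np.random.default_rng(2))  # differs from party 1's matrix
```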
“…Variants of the notion of differential privacy address important practical challenges. In particular, personalized differential privacy [29], protected differential privacy [31], relaxed differential privacy [6], and one-sided differential privacy [13] have a reversed order of quantification compared to sensitive privacy. Sensitive privacy quantifies sensitive records and their privacy after quantifying the database, in contrast to the previous work.…”
Section: Related Work
confidence: 99%
“…This is not extensible to the case where anomalies are defined relative to the other records [31]. Similarly, the proposed relaxed DP mechanism [6] is only applicable to anomalies defined in a data-independent manner.…”
Section: Related Work
confidence: 99%