2011
DOI: 10.1007/978-3-642-27189-2_16
An Information-Theoretic Privacy Criterion for Query Forgery in Information Retrieval

Abstract: In previous work, we presented a novel information-theoretic privacy criterion for query forgery in the domain of information retrieval. Our criterion measured privacy risk as a divergence between the user's and the population's query distribution, and contemplated the entropy of the user's distribution as a particular case. In this work, we make a twofold contribution. First, we thoroughly interpret and justify the privacy metric proposed in our previous work, elaborating on the intimate connection …

Cited by 14 publications (21 citation statements)
References 28 publications
“…Perturbation-based techniques obfuscate a user's interaction history by adding fake items and ratings to it. Rebollo et al. [41] propose an information-theoretic privacy metric and then find the obfuscation rate for generating forged user profiles so that the privacy risk is minimized. Similarly, [37] proposes adding or removing items and ratings from user profiles to minimize privacy risk.…”
Section: Related Work
confidence: 99%
“…Following this strategy, no third parties or external entities need to be trusted by the users to preserve their privacy. Existing approaches use different techniques and mechanisms and can be categorized mainly into three groups: cryptographic techniques [6,21,40,74,172], differential-privacy-based approaches [66,76,89,120,128,130,166,197,198], and perturbation-based techniques [75,118,146,147,148,151,153,158,183]. A group of works focuses on providing cryptographic solutions to the problem of secure recommender systems. These approaches do not let a single trusted party have access to everyone's data [6,21,40,74,172].…”
Section: Recommendation Systems and Privacy
confidence: 99%
“…Perturbation-based techniques usually obfuscate users' item ratings by adding random noise to the user data. Rebollo et al. [158] propose an approach that first measures the user's privacy risk as the KL divergence [48] between the user's apparent profile and the population's average profile. The idea is that the more a user's profile diverges from the general population's, the more information an attacker can learn about her.…”
Section: Perturbation-based Solutions
confidence: 99%
“…In the absence of a specific statistical model on the frequency distribution of user profiles, as argued extensively in [9,11,26] on the basis of Jaynes' rationale for maximum entropy methods, we assume that anonymity risk may be adequately measured as the Kullback-Leibler (KL) divergence D(p q) between the user profile p and the population's q. The idea is that user profiles become less common as they diverge from the average of the population.…”
Section: An Information-theoretic Model for Measuring Anonymity Risk
confidence: 99%
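The abstract notes that the entropy of the user's distribution arises as a particular case of the divergence criterion: when the population profile is uniform over n categories, D(p‖u) = log n − H(p), so minimizing divergence is equivalent to maximizing entropy. A small numerical check, using an illustrative distribution p:

```python
import numpy as np

def kl(p, q):
    # Kullback-Leibler divergence D(p || q) in nats
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def entropy(p):
    # Shannon entropy H(p) in nats
    mask = p > 0
    return float(-np.sum(p[mask] * np.log(p[mask])))

p = np.array([0.5, 0.25, 0.125, 0.125])  # illustrative user profile
n = len(p)
u = np.full(n, 1.0 / n)                  # uniform population profile

# Particular case: D(p || u) = log n - H(p)
assert abs(kl(p, u) - (np.log(n) - entropy(p))) < 1e-12
```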