2019
DOI: 10.1007/978-3-030-34223-4_31

RDF Graph Anonymization Robust to Data Linkage

Abstract: Privacy is a major concern when publishing new datasets in the context of Linked Open Data (LOD). A new dataset published in the LOD is indeed exposed to privacy breaches due to its linkage with objects already present in other datasets of the LOD. In this paper, we focus on the problem of building safe anonymizations of an RDF graph to guarantee that linking the anonymized graph with any external RDF graph will not cause privacy breaches. Given a set of privacy queries as input, we study the data-independen…
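A minimal sketch can make the abstract's setting concrete. The following is not the paper's algorithm but an illustration of the two ingredients it names: a privacy query (here in SPARQL) and an anonymization operation applied until the query no longer returns answers. The use of rdflib and all IRIs and data are assumptions made up for the example.

```python
# Sketch only: a privacy query over an RDF graph and one candidate
# anonymization operation (replacing exposed individuals by fresh blank
# nodes). Illustrative, not the paper's method; all data is invented.
from rdflib import Graph, Namespace, BNode, Literal
from rdflib.namespace import FOAF

EX = Namespace("http://example.org/")

g = Graph()
g.add((EX.alice, FOAF.name, Literal("Alice")))
g.add((EX.alice, EX.diagnosedWith, EX.flu))

# Privacy query: its answers must not be derivable from the published graph.
privacy_query = """
PREFIX ex: <http://example.org/>
PREFIX foaf: <http://xmlns.com/foaf/0.1/>
SELECT ?p ?d WHERE { ?p foaf:name ?n . ?p ex:diagnosedWith ?d . }
"""

# Anonymization operation: re-attach each exposed person's sensitive triples
# to a fresh blank node, breaking the join between name and diagnosis.
for person, _diagnosis in g.query(privacy_query):
    fresh = BNode()
    for s, p, o in list(g.triples((person, EX.diagnosedWith, None))):
        g.remove((s, p, o))
        g.add((fresh, p, o))

# The privacy query is now unanswerable on the anonymized graph.
assert len(list(g.query(privacy_query))) == 0
```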

Cited by 7 publications (5 citation statements). References 19 publications.
“…A query-based logical framework for RDF data has been introduced in [7,8], where sensitive information is expressed as SPARQL queries whose results must not disclose sensitive information about individuals. It has been extended to handle utility queries in [4,5]. These approaches, however, are restricted to simple conjunctive queries.…”
Section: Related Work (mentioning)
confidence: 99%
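A note on the last sentence: "simple conjunctive queries" are those whose WHERE clause is a plain basic graph pattern, i.e. a join of triple patterns. The contrast below uses invented prefixes and predicates; the OPTIONAL variant falls outside the conjunctive fragment the quoted approaches handle.

```python
# Illustrative contrast (invented vocabulary): a conjunctive SPARQL query
# versus a non-conjunctive one that uses OPTIONAL.
conjunctive = """
PREFIX ex: <http://example.org/>
SELECT ?p ?d WHERE {
  ?p ex:treatedIn ?h .         # only a join of triple patterns
  ?p ex:diagnosedWith ?d .
}
"""

non_conjunctive = """
PREFIX ex: <http://example.org/>
SELECT ?p ?d WHERE {
  ?p ex:treatedIn ?h .
  OPTIONAL { ?p ex:diagnosedWith ?d . }  # OPTIONAL leaves the conjunctive fragment
}
"""
```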
“…First, instead of making the data public, one can provide only restricted access through queries, whose answers are monitored by a "censor", which may decide not to give an answer, or even to lie, if needed to satisfy the constraints [7][8][9]. Second, one can publish the data in an appropriately anonymized form, while keeping available as much information about individuals as the policy allows [2,4,6,10,13,14].…”
Section: Introduction (mentioning)
confidence: 99%
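The censor-based alternative mentioned first in this passage can be sketched as a query mediator that withholds answers overlapping a set of secret tuples. Everything below is an assumption-laden illustration, not any cited system; a real censor may also need to lie, since a visible refusal can itself leak information.

```python
# Sketch of a censor: run the client's query, but refuse any answer row
# that coincides with a secret tuple. Names and data are invented.
from rdflib import Graph, Namespace

EX = Namespace("http://example.org/")

g = Graph()
g.add((EX.alice, EX.diagnosedWith, EX.flu))

# Tuples that must never be returned to a client.
SECRETS = {(EX.alice, EX.flu)}

def censored_query(graph, sparql):
    """Yield answer rows, replacing secret rows by a refusal (None)."""
    for row in graph.query(sparql):
        yield None if tuple(row) in SECRETS else tuple(row)

q = """PREFIX ex: <http://example.org/>
SELECT ?p ?d WHERE { ?p ex:diagnosedWith ?d . }"""
print(list(censored_query(g, q)))  # [None]: the secret row is refused
```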
“…The papers [2,4,6] and this one allow for arbitrary modifications of the original data set, as long as the new data is logically implied by the original one. In contrast, the work from [10,13,14] restricts modifications to the application of certain anonymization operations. Another distinguishing criterion is which formalisms are employed for representing the data and the policy.…”
Section: Introduction (mentioning)
confidence: 99%
“…In particular, this sort of attack may occur when the combination of the repaired ontology with the attackers' background knowledge still violates a privacy policy. Indeed, considering such malicious attacks on ontologies in the last step above has also recently been investigated in different contexts, such as in the area of linked data [6,9] or in the area of ontology-based data integration (OBDI) [4]. In the context of privacy in DL ontologies, to the best of our knowledge, the study of preserving identity or accounting for such linkage attacks has remained unexplored, whereas studies of ontology repairs have been carried out, e.g., by [10,16], with different settings and motivations.…”
Section: Introduction (mentioning)
confidence: 99%