Conference paper, 2019

RDF graph anonymization robust to data linkage

Abstract

Privacy is a major concern when publishing new datasets in the context of Linked Open Data (LOD). A new dataset published in the LOD is exposed to privacy breaches through its linkage to objects already present in other LOD datasets. In this paper, we focus on the problem of building safe anonymizations of an RDF graph, so as to guarantee that linking the anonymized graph with any external RDF graph will not cause privacy breaches. Given a set of privacy queries as input, we study the data-independent safety problem and the sequence of anonymization operations needed to enforce it. We provide sufficient conditions under which an anonymization instance is safe with respect to a set of privacy queries. We also show that our algorithms for RDF data anonymization are robust in the presence of sameAs links, whether explicit or inferred from additional knowledge.
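
To make the setting concrete, here is a minimal sketch, assuming Python with the rdflib library, of what a privacy query and one candidate anonymization operation could look like. The URIs, data, and the blank-node replacement step are illustrative assumptions for exposition, not the algorithm or operations proposed in the paper.

```python
# A minimal illustrative sketch (not the paper's actual algorithm): a privacy
# query expressed in SPARQL and one candidate anonymization step, written with
# the rdflib library. All URIs and data below are hypothetical.
from rdflib import BNode, Graph, Namespace

EX = Namespace("http://example.org/")

g = Graph()
g.add((EX.alice, EX.livesIn, EX.Paris))      # sensitive association
g.add((EX.alice, EX.worksFor, EX.AcmeCorp))  # non-sensitive association

# A privacy query: its answers must not be recoverable from the anonymized
# graph, even after linkage (e.g. via sameAs links) with external LOD datasets.
privacy_query = """
PREFIX ex: <http://example.org/>
SELECT ?person ?city WHERE { ?person ex:livesIn ?city . }
"""

# One candidate anonymization operation: replace the subject of every triple
# matched by the privacy query with a fresh blank node, severing the link
# between the identifier and the sensitive value while keeping the value.
for person, city in g.query(privacy_query):
    g.remove((person, EX.livesIn, city))
    g.add((BNode(), EX.livesIn, city))

print(g.serialize(format="turtle"))
```

After this operation, the anonymized graph still records that someone lives in Paris, but no longer ties that fact to the identifier ex:alice, which is the kind of guarantee the privacy queries are meant to capture.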
Main file: paper.pdf (523.55 KB). Origin: files produced by the author(s).

Dates and versions

hal-02444752, version 1 (19-01-2020)

Identifiers

HAL Id: hal-02444752
DOI: 10.1007/978-3-030-34223-4_31

Cite

Rémy Delanaux, Angela Bonifati, Marie-Christine Rousset, Romuald Thion. RDF graph anonymization robust to data linkage. WISE 2019 - 20th International Conference on Web Information Systems Engineering, Jan 2020, Hong Kong, China. pp.491-506, ⟨10.1007/978-3-030-34223-4_31⟩. ⟨hal-02444752⟩