Individual differential privacy: A utility-preserving formulation of differential privacy guarantees

dc.contributor
Universitat Oberta de Catalunya. Internet Interdisciplinary Institute (IN3)
dc.contributor
Universitat Rovira i Virgili
dc.contributor.author
Soria Comas, Jordi
dc.contributor.author
Domingo-Ferrer, Josep
dc.contributor.author
Sánchez Ruenes, David
dc.contributor.author
Megías Jiménez, David
dc.date
2018-07-03T10:10:14Z
dc.date
2018-07-03T10:10:14Z
dc.date
2016-07-13
dc.identifier.citation
Soria-Comas, J., Domingo-Ferrer, J., Sánchez Ruenes, D. & Megías, D. (2017). Individual Differential Privacy: A Utility-Preserving Formulation of Differential Privacy Guarantees. IEEE Transactions on Information Forensics and Security, 12(6), 1418-1429. doi: 10.1109/TIFS.2017.2663337
dc.identifier.citation
1556-6013
dc.identifier.citation
1556-6021
dc.identifier.citation
10.1109/TIFS.2017.2663337
dc.identifier.uri
http://hdl.handle.net/10609/82345
dc.description.abstract
Differential privacy is a popular privacy model within the research community because of the strong privacy guarantee it offers, namely that the presence or absence of any individual in a data set does not significantly influence the results of analyses on the data set. However, enforcing this strict guarantee in practice significantly distorts data and/or limits data uses, thus diminishing the analytical utility of the differentially private results. In an attempt to address this shortcoming, several relaxations of differential privacy have been proposed that trade off privacy guarantees for improved data utility. In this paper, we argue that the standard formalization of differential privacy is stricter than needed to provide the intuitive privacy guarantee it seeks. In particular, the standard formalization requires indistinguishability of results between any pair of neighboring data sets, whereas indistinguishability between the actual data set and its neighboring data sets should suffice. This limits the data controller's ability to adjust the level of protection to the actual data, resulting in significant accuracy loss. Accordingly, we propose individual differential privacy, an alternative differential privacy notion that offers the same privacy guarantees as standard differential privacy to individuals (even though not to groups of individuals). This new notion allows the data controller to adjust the distortion to the actual data set, which yields less distortion and more analytical accuracy. We propose several mechanisms to attain individual differential privacy and compare the new notion against standard differential privacy in terms of the accuracy of the analytical results.
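The intuition in the abstract can be illustrated with a minimal sketch (not the paper's actual mechanisms): a standard differentially private mean adds Laplace noise calibrated to the *global* sensitivity, i.e., the worst case over all pairs of neighboring data sets, while an individual-DP-style mechanism may calibrate noise to the sensitivity of the *actual* data set held by the controller, which is never larger and is often much smaller. All function names below are illustrative assumptions.

```python
import math
import random

def laplace_noise(scale, rng=random):
    """Sample from Laplace(0, scale) via inverse-CDF sampling."""
    if scale == 0.0:
        return 0.0
    u = rng.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def global_sensitivity_mean(n, lo, hi):
    # Worst case over ALL pairs of neighboring data sets of size n with
    # values in [lo, hi]: replacing one record moves the mean by at most
    # (hi - lo) / n.
    return (hi - lo) / n

def local_sensitivity_mean(data, lo, hi):
    # Worst case over neighbors of the ACTUAL data set only: replacing
    # record x with any value in [lo, hi] moves the mean by at most
    # max(x - lo, hi - x) / n.
    n = len(data)
    return max(max(x - lo, hi - x) for x in data) / n

def dp_mean(data, lo, hi, eps):
    # Standard eps-DP: Laplace noise scaled to global sensitivity.
    scale = global_sensitivity_mean(len(data), lo, hi) / eps
    return sum(data) / len(data) + laplace_noise(scale)

def idp_mean(data, lo, hi, eps):
    # Illustrative individual-DP-style mechanism: noise scaled to the
    # sensitivity of the data set actually held. For data concentrated
    # away from the domain boundaries, this scale is smaller than the
    # global one, so less distortion is added.
    scale = local_sensitivity_mean(data, lo, hi) / eps
    return sum(data) / len(data) + laplace_noise(scale)
```

For example, for ten records all equal to 0.5 on the domain [0, 1], the local sensitivity of the mean is 0.05 while the global sensitivity is 0.1, so the individual-DP-style mean adds noise with half the scale. (Calibrating directly to local sensitivity is a simplification for illustration; the paper proposes concrete mechanisms that attain the individual differential privacy guarantee.)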
dc.format
application/pdf
dc.language.iso
eng
dc.publisher
IEEE Transactions on Information Forensics and Security
dc.relation
IEEE Transactions on Information Forensics and Security, 2017, 12(6)
dc.relation
https://ieeexplore.ieee.org/document/7839941/
dc.rights
CC BY-NC-ND
dc.rights
info:eu-repo/semantics/openAccess
dc.rights
http://creativecommons.org/licenses/by-nc-nd/3.0/es/
dc.subject
data privacy
dc.subject
data utility
dc.subject
differential privacy
dc.subject
privacitat de dades
dc.subject
utilitat de dades
dc.subject
privacitat diferencial
dc.subject
privacidad de datos
dc.subject
utilidad de datos
dc.subject
privacidad diferencial
dc.subject
Data protection
dc.subject
Protecció de dades
dc.subject
Protección de datos
dc.title
Individual differential privacy: A utility-preserving formulation of differential privacy guarantees
dc.type
info:eu-repo/semantics/article
dc.type
info:eu-repo/semantics/submittedVersion