TY - GEN

T1 - Calibrating noise to sensitivity in private data analysis

AU - Dwork, Cynthia

AU - McSherry, Frank

AU - Nissim, Kobbi

AU - Smith, Adam

N1 - Copyright:
Copyright 2011 Elsevier B.V., All rights reserved.

PY - 2006

Y1 - 2006

N2 - We continue a line of research initiated in [10, 11] on privacy-preserving statistical databases. Consider a trusted server that holds a database of sensitive information. Given a query function f mapping databases to reals, the so-called true answer is the result of applying f to the database. To protect privacy, the true answer is perturbed by the addition of random noise generated according to a carefully chosen distribution, and this response, the true answer plus noise, is returned to the user. Previous work focused on the case of noisy sums, in which f = ∑ᵢ g(xᵢ), where xᵢ denotes the i-th row of the database and g maps database rows to [0, 1]. We extend the study to general functions f, proving that privacy can be preserved by calibrating the standard deviation of the noise according to the sensitivity of the function f. Roughly speaking, this is the amount that any single argument to f can change its output. The new analysis shows that for several particular applications substantially less noise is needed than was previously understood to be the case. The first step is a very clean characterization of privacy in terms of indistinguishability of transcripts. Additionally, we obtain separation results showing the increased value of interactive sanitization mechanisms over non-interactive.

AB - We continue a line of research initiated in [10, 11] on privacy-preserving statistical databases. Consider a trusted server that holds a database of sensitive information. Given a query function f mapping databases to reals, the so-called true answer is the result of applying f to the database. To protect privacy, the true answer is perturbed by the addition of random noise generated according to a carefully chosen distribution, and this response, the true answer plus noise, is returned to the user. Previous work focused on the case of noisy sums, in which f = ∑ᵢ g(xᵢ), where xᵢ denotes the i-th row of the database and g maps database rows to [0, 1]. We extend the study to general functions f, proving that privacy can be preserved by calibrating the standard deviation of the noise according to the sensitivity of the function f. Roughly speaking, this is the amount that any single argument to f can change its output. The new analysis shows that for several particular applications substantially less noise is needed than was previously understood to be the case. The first step is a very clean characterization of privacy in terms of indistinguishability of transcripts. Additionally, we obtain separation results showing the increased value of interactive sanitization mechanisms over non-interactive.

UR - http://www.scopus.com/inward/record.url?scp=33745556605&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=33745556605&partnerID=8YFLogxK

U2 - 10.1007/11681878_14

DO - 10.1007/11681878_14

M3 - Conference contribution

AN - SCOPUS:33745556605

SN - 3540327312

SN - 9783540327318

T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)

SP - 265

EP - 284

BT - Theory of Cryptography

T2 - 3rd Theory of Cryptography Conference, TCC 2006

Y2 - 4 March 2006 through 7 March 2006

ER -