TY - GEN
T1 - Max-Information, Differential Privacy, and Post-selection Hypothesis Testing
AU - Rogers, Ryan
AU - Roth, Aaron
AU - Smith, Adam
AU - Thakkar, Om
N1 - Publisher Copyright:
© 2016 IEEE.
PY - 2016/12/14
Y1 - 2016/12/14
AB - In this paper, we initiate a principled study of how the generalization properties of approximate differential privacy can be used to perform adaptive hypothesis testing, while giving statistically valid p-value corrections. We do this by observing that the guarantees of algorithms with bounded approximate max-information are sufficient to correct the p-values of adaptively chosen hypotheses, and then by proving that algorithms that satisfy (ϵ,δ)-differential privacy have bounded approximate max-information when their inputs are drawn from a product distribution. This substantially extends the known connection between differential privacy and max-information, which was previously known to hold only for (pure) (ϵ,0)-differential privacy. It also extends our understanding of max-information as a partially unifying measure controlling the generalization properties of adaptive data analyses. We also show a lower bound, proving that, despite the strong composition properties of max-information, when data is drawn from a product distribution, (ϵ,δ)-differentially private algorithms can come first in a composition with other algorithms satisfying max-information bounds, but not necessarily second if the composition is required to itself satisfy a nontrivial max-information bound. This, in particular, implies that the connection between (ϵ,δ)-differential privacy and max-information holds only for inputs drawn from product distributions, unlike the connection between (ϵ,0)-differential privacy and max-information, which holds for arbitrary input distributions.
UR - http://www.scopus.com/inward/record.url?scp=85009347998&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85009347998&partnerID=8YFLogxK
DO - 10.1109/FOCS.2016.59
M3 - Conference contribution
AN - SCOPUS:85009347998
T3 - Proceedings - Annual IEEE Symposium on Foundations of Computer Science, FOCS
SP - 487
EP - 494
BT - Proceedings - 57th Annual IEEE Symposium on Foundations of Computer Science, FOCS 2016
PB - IEEE Computer Society
T2 - 57th Annual IEEE Symposium on Foundations of Computer Science, FOCS 2016
Y2 - 9 October 2016 through 11 October 2016
ER -
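For readers of the abstract above, a minimal sketch of the two quantities it turns on, stated in standard notation rather than quoted from the paper (the symbols I_∞^β, k, β, α, and the event O below are illustrative assumptions, not the paper's own text):

% Sketch, assuming the usual definition: an algorithm A has
% beta-approximate max-information at most k over inputs X if, for
% every measurable event O over (dataset, output) pairs,
%   Pr[(X, A(X)) in O] <= 2^k * Pr[(X, A(X')) in O] + beta,
% where X' is an independent copy of X, so (X, A(X')) is the product
% of the two marginals.
\[
  I_{\infty}^{\beta}\bigl(X;\mathcal{A}(X)\bigr) \le k
  \quad\Longleftrightarrow\quad
  \forall O:\;
  \Pr\bigl[(X,\mathcal{A}(X)) \in O\bigr]
  \le 2^{k}\,\Pr\bigl[(X,\mathcal{A}(X')) \in O\bigr] + \beta .
\]
% Consequence for post-selection testing: if a hypothesis test is
% chosen as a function of A(X), and its p-value p is valid at level
% alpha on fresh data, then under the null hypothesis
\[
  \Pr\bigl[\,p \le \alpha\,\bigr] \le 2^{k}\,\alpha + \beta
  \qquad \text{for every } \alpha \in [0,1],
\]
% so reporting min(1, 2^k * p + beta) in place of p restores
% statistical validity; this is the style of correction the abstract
% describes.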