TY - JOUR
T1 - An in-depth examination of requirements for disclosure risk assessment
AU - Jarmin, Ron S.
AU - Abowd, John M.
AU - Ashmead, Robert
AU - Cumings-Menon, Ryan
AU - Goldschlag, Nathan
AU - Hawes, Michael B.
AU - Keller, Sallie Ann
AU - Kifer, Daniel
AU - Leclerc, Philip
AU - Reiter, Jerome P.
AU - Rodríguez, Rolando A.
AU - Schmutte, Ian
AU - Velkoff, Victoria A.
AU - Zhuravlev, Pavel
N1 - Publisher Copyright:
Copyright © 2023 the Author(s). Published by PNAS. This article is distributed under Creative Commons Attribution-NonCommercial-NoDerivatives License 4.0 (CC BY-NC-ND).
PY - 2023
Y1 - 2023
AB - The use of formal privacy to protect the confidentiality of responses in the 2020 Decennial Census of Population and Housing has triggered renewed interest and debate over how to measure the disclosure risks and societal benefits of the published data products. We argue that any proposal for quantifying disclosure risk should be based on prespecified, objective criteria. We illustrate this approach to evaluate the absolute disclosure risk framework, the counterfactual framework underlying differential privacy, and prior-to-posterior comparisons. We conclude that satisfying all the desiderata is impossible, but counterfactual comparisons satisfy the most while absolute disclosure risk satisfies the fewest. Furthermore, we explain that many of the criticisms levied against differential privacy would be levied against any technology that is not equivalent to direct, unrestricted access to confidential data. More research is needed, but in the near term, the counterfactual approach appears best-suited for privacy versus utility analysis.
UR - http://www.scopus.com/inward/record.url?scp=85175023378&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85175023378&partnerID=8YFLogxK
U2 - 10.1073/pnas.2220558120
DO - 10.1073/pnas.2220558120
M3 - Article
C2 - 37831744
AN - SCOPUS:85175023378
SN - 0027-8424
VL - 120
JO - Proceedings of the National Academy of Sciences of the United States of America
JF - Proceedings of the National Academy of Sciences of the United States of America
IS - 43
M1 - e2220558120
ER -