SaTC: CORE: Medium: Developing for Differential Privacy with Formal Methods and Counterexamples

Project: Research project

Project Details

Description

Differential privacy is an elegant formulation of the conditions an algorithm must meet in order to avoid leaking personal information contained in its inputs. It is becoming mainstream in many research communities and has been deployed in practice in the private sector and in some government agencies. Accompanying this growth is an unfortunate, but expected, difficulty: many algorithms that are claimed to be differentially private are in fact incorrect. This phenomenon affects both newcomers and seasoned veterans of the differential privacy field because of the difficulty and subtlety of developing new differentially private algorithms.
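The flavor of these subtle errors can be seen in even the simplest differentially private algorithm, the Laplace mechanism. The sketch below is purely illustrative and is not part of DevDP or drawn from the proposal; the dataset and parameter values are made up for the example.

```python
import numpy as np

def laplace_mechanism(true_value, epsilon, sensitivity=1.0):
    """Release a noisy answer intended to satisfy epsilon-differential privacy.

    Adding Laplace noise with scale sensitivity/epsilon to a query whose
    answer changes by at most `sensitivity` between neighboring databases
    is the textbook way to meet the differential privacy condition.  A
    common subtle bug is to scale the noise by epsilon/sensitivity, or to
    underestimate the sensitivity, which silently breaks the guarantee.
    """
    noise = np.random.laplace(loc=0.0, scale=sensitivity / epsilon)
    return true_value + noise

# A counting query over a made-up dataset: sensitivity is 1 because adding
# or removing one person changes the count by at most 1.
ages = [21, 34, 45, 52, 67]
noisy_count = laplace_mechanism(sum(1 for a in ages if a >= 40), epsilon=0.5)
print(f"Noisy count of people aged 40+: {noisy_count:.2f}")
```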

This proposal outlines a research plan for the development of a system called DevDP, whose purpose is to enable both novice and expert users to prototype and explore differentially private algorithms. In particular, the project will develop program analysis tools and theory that leverage both programming language and machine learning technology to aid the development of correct differentially private programs by automating much of the verification and of the reasoning about errors. As part of its broader impacts, DevDP also has the potential to help educate students and less-technical members of the scientific community by providing interactive software tools. A solid understanding of differential privacy will become crucial as it makes its way into public policy.
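A toy illustration of the counterexample-driven style of analysis the project title alludes to: run a candidate mechanism many times on two neighboring inputs, estimate the two output distributions, and check the epsilon-DP inequality bucket by bucket; a large violation points to a concrete counterexample. This is a minimal sketch, not DevDP itself; the function, thresholds, and the deliberately broken mechanism at the end are invented for illustration.

```python
import numpy as np

def empirical_dp_violation(mechanism, db1, db2, epsilon,
                           trials=200_000, bins=50):
    """Estimate how badly `mechanism` violates epsilon-DP on db1 vs. db2.

    Buckets the sampled outputs and checks, bucket by bucket, whether
        Pr[M(db1) in S] <= exp(epsilon) * Pr[M(db2) in S]
    holds in both directions.  A value well above sampling noise suggests
    that (db1, db2) plus the offending bucket form a counterexample.
    """
    out1 = np.array([mechanism(db1) for _ in range(trials)])
    out2 = np.array([mechanism(db2) for _ in range(trials)])
    edges = np.linspace(min(out1.min(), out2.min()),
                        max(out1.max(), out2.max()), bins + 1)
    p1 = np.histogram(out1, bins=edges)[0] / trials
    p2 = np.histogram(out2, bins=edges)[0] / trials
    return max(np.max(p1 - np.exp(epsilon) * p2),
               np.max(p2 - np.exp(epsilon) * p1))

# A deliberately broken "private" sum: the noise scale ignores epsilon,
# so neighboring inputs produce easily distinguishable outputs.
broken = lambda db: sum(db) + np.random.laplace(scale=0.1)
print(empirical_dp_violation(broken, [1, 1, 0], [1, 1, 1], epsilon=0.5))
```

Real tools in this space combine such statistical evidence with program analysis and formal proof; the sketch only conveys why a concrete counterexample is a useful artifact for both newcomers and experts debugging a privacy claim.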

Status: Finished
Effective start/end date: 7/1/17 – 6/30/23

Funding

  • National Science Foundation: $1,200,000.00
