DIFFERENTIALLY PRIVATE CONFIDENCE INTERVALS FOR EMPIRICAL RISK MINIMIZATION

Research output: Contribution to journal › Article › peer-review

16 Scopus citations

Abstract

Data mining with differential privacy produces results that are affected by two types of noise: sampling noise due to data collection and privacy noise that is injected to prevent the reconstruction of sensitive information. In this paper, we consider the problem of designing confidence intervals for the parameters of a variety of differentially private machine learning models. The algorithms provide confidence intervals that satisfy differential privacy (as well as the more recently proposed concentrated differential privacy) and can be used with existing differentially private mechanisms that train models via objective perturbation and output perturbation.
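To make the two noise sources concrete, here is a minimal illustrative sketch (not the paper's algorithm) of output perturbation for a bounded-mean query: Laplace noise is added to the empirical mean, and a naive normal-approximation confidence interval is widened to account for both the sampling variance and the variance of the injected privacy noise. The function name and the variance-combining rule are assumptions for illustration only.

```python
import numpy as np

def private_mean_ci(data, epsilon, rng=None):
    """Illustrative sketch: differentially private mean of data in [0, 1]
    via output perturbation (Laplace mechanism), with a naive 95% confidence
    interval accounting for both sampling noise and the injected privacy noise.

    This is NOT the algorithm from the paper; it only demonstrates the two
    noise sources the abstract describes.
    """
    rng = np.random.default_rng() if rng is None else rng
    data = np.clip(np.asarray(data, dtype=float), 0.0, 1.0)
    n = len(data)

    # Output perturbation: Laplace noise scaled to the sensitivity of the
    # mean, which is 1/n for data bounded in [0, 1].
    scale = 1.0 / (n * epsilon)
    private_est = data.mean() + rng.laplace(0.0, scale)

    # Naive normal-approximation CI: total variance is the sum of the
    # sampling variance of the mean and the Laplace noise variance (2*scale^2).
    sampling_var = data.var(ddof=1) / n
    privacy_var = 2.0 * scale**2
    z = 1.96  # approximate 97.5th percentile of the standard normal
    half_width = z * np.sqrt(sampling_var + privacy_var)
    return private_est - half_width, private_est + half_width
```

As epsilon shrinks (stronger privacy), the privacy-noise variance dominates and the interval widens, which is exactly the trade-off the paper's mechanisms are designed to quantify rigorously.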

Original language: English (US)
Journal: Journal of Privacy and Confidentiality
Volume: 9
Issue number: 1 Special Issue
State: Published - Mar 31 2019

All Science Journal Classification (ASJC) codes

  • Computer Science (miscellaneous)
  • Statistics and Probability
  • Computer Science Applications
