Abstract
In this paper, we present a Rényi differentially private stochastic gradient descent (SGD) algorithm for convex empirical risk minimization. The algorithm uses output perturbation and leverages the randomness inside SGD, which creates a “randomized sensitivity,” to reduce the amount of noise that is added. One benefit of output perturbation is that we can incorporate a periodic averaging step that further reduces sensitivity while improving accuracy (mitigating the well-known oscillating behavior of SGD near the optimum). Rényi Differential Privacy can be converted into (ε,δ)-differential privacy guarantees, enabling comparison with prior work. An empirical evaluation demonstrates that the proposed method outperforms prior methods on differentially private ERM.
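For intuition, the sketch below illustrates the general output-perturbation pattern the abstract describes, not the paper's exact algorithm or noise calibration: SGD runs without per-step noise, iterates are averaged every `avg_every` steps, and Gaussian noise is added once to the released model. The function and parameter names (`private_sgd_output_perturbation`, `grad_fn`, `sigma`) are illustrative; in particular, `sigma` is left as a free parameter rather than derived from the paper's randomized-sensitivity analysis. For the conversion mentioned in the abstract, the standard fact is that a mechanism satisfying (α, ε)-Rényi DP also satisfies (ε + log(1/δ)/(α−1), δ)-differential privacy for any δ ∈ (0, 1).

```python
import numpy as np

def private_sgd_output_perturbation(
    X, y, grad_fn, steps=1000, avg_every=50, lr=0.05, sigma=1.0, seed=0
):
    """Non-private SGD followed by a single Gaussian output perturbation.

    `grad_fn(w, x_i, y_i)` returns the gradient of the per-example convex
    loss. `sigma` must be calibrated offline to the sensitivity of the
    final iterate (the paper's analysis exploits SGD's sampling
    randomness to shrink this sensitivity); here it is a free parameter.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    window = []  # iterates in the current averaging window

    for t in range(1, steps + 1):
        i = rng.integers(n)                  # uniform sampling: the randomness the analysis exploits
        w = w - lr * grad_fn(w, X[i], y[i])  # ordinary SGD step, no per-step noise
        window.append(w)
        if t % avg_every == 0:               # periodic averaging: restart from the window mean,
            w = np.mean(window, axis=0)      # damping SGD's oscillation near the optimum
            window = []

    # Output perturbation: noise is added once, only to the released model.
    return w + sigma * rng.standard_normal(d)

# Example usage with a logistic-loss gradient (illustrative):
# grad = lambda w, x, yi: (1.0 / (1.0 + np.exp(-x @ w)) - yi) * x
# w_priv = private_sgd_output_perturbation(X, y, grad, sigma=0.5)
```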
Original language | English (US)
---|---
State | Published - 2020
Event | 22nd International Conference on Artificial Intelligence and Statistics, AISTATS 2019, Naha, Japan. Duration: Apr 16 2019 → Apr 18 2019
Conference

Conference | 22nd International Conference on Artificial Intelligence and Statistics, AISTATS 2019
---|---
Country/Territory | Japan
City | Naha
Period | 4/16/19 → 4/18/19
All Science Journal Classification (ASJC) codes
- Artificial Intelligence
- Statistics and Probability