Preprint

Privacy Amplification of Iterative Algorithms via Contraction Coefficients

Abstract

We investigate the framework of privacy amplification by iteration, recently proposed by Feldman et al., through an information-theoretic lens. We demonstrate that the differential privacy guarantees of iterative mappings can be determined by a direct application of contraction coefficients derived from strong data processing inequalities for $f$-divergences. In particular, by generalizing Dobrushin's contraction coefficient for total variation distance to the $f$-divergence known as $E_{\gamma}$-divergence, we derive tighter bounds on the differential privacy parameters of the projected noisy stochastic gradient descent algorithm with hidden intermediate updates.
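For orientation, the key objects named in the abstract can be stated precisely. The definitions below are standard in the information theory and differential privacy literature and are included as context; they are not reproduced verbatim from the preprint. For $\gamma \ge 1$, the $E_{\gamma}$-divergence (also called the hockey-stick divergence) between distributions $P$ and $Q$ is

\[
E_{\gamma}(P \,\|\, Q) = \sup_{A} \big( P(A) - \gamma\, Q(A) \big),
\]

and a mechanism $\mathcal{M}$ is $(\varepsilon, \delta)$-differentially private if and only if $E_{e^{\varepsilon}}\big(\mathcal{M}(D) \,\|\, \mathcal{M}(D')\big) \le \delta$ for all neighboring datasets $D$, $D'$. The contraction coefficient of a Markov kernel $K$ with respect to $E_{\gamma}$, generalizing Dobrushin's coefficient for total variation distance (the case $\gamma = 1$), is

\[
\eta_{\gamma}(K) = \sup_{\substack{P,\, Q:\\ E_{\gamma}(P \| Q) > 0}} \frac{E_{\gamma}(PK \,\|\, QK)}{E_{\gamma}(P \,\|\, Q)},
\]

so that each application of $K$ satisfies the strong data processing inequality $E_{\gamma}(PK \,\|\, QK) \le \eta_{\gamma}(K)\, E_{\gamma}(P \,\|\, Q)$. Iterating this inequality across successive noisy updates is the mechanism by which amplification guarantees of the kind described in the abstract arise.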

Authors

Asoodeh S; Diaz M; Calmon FP

Publication date

January 17, 2020

DOI

10.48550/arXiv.2001.06546

Preprint server

arXiv