Conference
Differentially Private Federated Learning: An Information-Theoretic Perspective
Abstract
We propose a new technique for deriving the differential privacy parameters in federated learning (FL). We consider the setting where a machine learning model is iteratively trained using stochastic gradient descent (SGD) and only the last update is publicly released. In this approach, we interpret each training iteration as a Markov kernel. We then quantify the impact of the kernel on privacy parameters via the contraction coefficient of the …
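The abstract's setting can be illustrated with a minimal sketch (not the authors' code): one noisy-SGD iteration of the kind described, where the update w → w − η·(clip(grad) + noise) acts as a Gaussian Markov kernel on the model parameters, and only the final iterate is released. The function name, step size, and clipping threshold below are illustrative assumptions.

```python
import numpy as np

def noisy_sgd_step(w, grad, eta=0.1, clip_norm=1.0, sigma=1.0, rng=None):
    """One DP-SGD iteration: clip the gradient, add Gaussian noise, step.

    Each call is a Markov kernel on the parameter vector w; releasing only
    the last iterate is the setting the abstract analyzes via the kernel's
    contraction coefficient (illustrative sketch, not the paper's method).
    """
    rng = np.random.default_rng(rng)
    norm = np.linalg.norm(grad)
    clipped = grad / max(1.0, norm / clip_norm)  # enforce ||grad|| <= clip_norm
    noise = rng.normal(0.0, sigma * clip_norm, size=w.shape)
    return w - eta * (clipped + noise)

# Run a few iterations and release only the final parameters.
w = np.zeros(3)
for seed in range(5):
    w = noisy_sgd_step(w, np.array([3.0, 4.0, 0.0]), rng=seed)
```

With `sigma=0` the step reduces to plain clipped SGD, which makes the clipping behavior easy to check in isolation.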
Authors
Asoodeh S; Chen W-N; Calmon FP; Özgür A
Pagination
pp. 344-349
Publisher
Institute of Electrical and Electronics Engineers (IEEE)
Publication Date
July 20, 2021
DOI
10.1109/isit45174.2021.9518124
Name of conference
2021 IEEE International Symposium on Information Theory (ISIT)