
Strong Data Processing Inequalities for Locally Differentially Private Mechanisms

Abstract

We investigate the strong data processing inequalities of locally differentially private mechanisms under a specific f-divergence, namely the Eγ-divergence. More specifically, we characterize an upper bound on the Eγ-divergence between PK and QK, the output distributions of an ε-LDP mechanism K, in terms of the Eγ-divergence between the corresponding input distributions P and Q. Interestingly, the tightest such upper bound in the binary case turns out to have a non-multiplicative form. We then extend our results to derive a tight upper bound for general f-divergences. As an application of our main findings, we derive a lower bound on the locally private Bayesian estimation risk that is tighter than the available divergence-based bound in the literature.
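
For readers unfamiliar with the quantities named in the abstract, the following LaTeX snippet records the standard textbook definitions of the Eγ-divergence (also known as the hockey-stick divergence) and of ε-local differential privacy. These definitions are included only for context; they are not a statement of the paper's bound, and the notation K, B is chosen here for illustration.

\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}
% E_gamma-divergence (hockey-stick divergence) between distributions P and Q, for gamma >= 1:
\[
  E_\gamma(P \,\|\, Q)
  \;=\; \sup_{A}\bigl(P(A) - \gamma\, Q(A)\bigr)
  \;=\; \int \bigl(\mathrm{d}P - \gamma\,\mathrm{d}Q\bigr)^{+},
\]
% i.e. the f-divergence generated by f(t) = (t - gamma)^+.
%
% epsilon-local differential privacy: a mechanism K (a Markov kernel) is eps-LDP if
\[
  \mathsf{K}(B \mid x) \;\le\; e^{\varepsilon}\,\mathsf{K}(B \mid x')
  \quad \text{for all inputs } x, x' \text{ and all measurable output sets } B.
\]
% PK and QK in the abstract denote the output distributions obtained by pushing
% the input distributions P and Q through the mechanism K.
\end{document}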

Authors

Zamanlooy B; Asoodeh S

Pagination

pp. 1794-1799

Publisher

Institute of Electrical and Electronics Engineers (IEEE)

Publication Date

June 30, 2023

DOI

10.1109/isit54713.2023.10206578

Name of conference

2023 IEEE International Symposium on Information Theory (ISIT)