Contraction of Locally Differentially Private Mechanisms
Abstract
We investigate the contraction properties of locally differentially private
mechanisms. More specifically, we derive tight upper bounds on the divergence
between the output distributions $PK$ and $QK$ of an $\varepsilon$-LDP mechanism $K$
in terms of a divergence between the corresponding input distributions $P$ and
$Q$. Our first main technical result presents a sharp upper bound on the
$\chi^2$-divergence $\chi^2(PK\|QK)$ in terms of $\chi^2(P\|Q)$ and
$\varepsilon$. We also show that the same result holds for a large family of
divergences, including KL-divergence and squared Hellinger distance. The second
main technical result gives an upper bound on $\chi^2(PK\|QK)$ in terms of the
total variation distance $\mathsf{TV}(P, Q)$ and $\varepsilon$. We then utilize
these bounds to establish locally private versions of the van Trees inequality
and of Le Cam's, Assouad's, and the mutual information methods, which are powerful
tools for bounding minimax estimation risks. These results are shown to lead to
better privacy analyses than the state of the art in several statistical
problems such as entropy and discrete distribution estimation, non-parametric
density estimation, and hypothesis testing.
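As a quick illustrative check (our own sketch, not part of the paper), the following Python snippet instantiates binary randomized response, the canonical $\varepsilon$-LDP mechanism, and measures how much it contracts the total variation and $\chi^2$ divergences. The inputs $P$, $Q$ and the privacy level eps are arbitrary choices of ours; the only external fact used is the classical bound $\mathsf{TV}(PK, QK) \le \tanh(\varepsilon/2)\,\mathsf{TV}(P, Q)$ for any $\varepsilon$-LDP channel.

import numpy as np

def randomized_response(eps):
    # Binary randomized response: report the true bit with probability
    # e^eps / (1 + e^eps).  Row likelihood ratios are bounded by e^eps,
    # so the kernel K satisfies eps-LDP.
    p = np.exp(eps) / (1.0 + np.exp(eps))
    return np.array([[p, 1.0 - p],
                     [1.0 - p, p]])

def chi2(p, q):
    # chi^2(P || Q) = sum_y (p_y - q_y)^2 / q_y
    return float(np.sum((p - q) ** 2 / q))

def tv(p, q):
    # TV(P, Q) = (1/2) sum_y |p_y - q_y|
    return 0.5 * float(np.sum(np.abs(p - q)))

eps = 1.0
K = randomized_response(eps)
P = np.array([0.8, 0.2])   # arbitrary input distribution
Q = np.array([0.5, 0.5])   # uniform reference input
PK, QK = P @ K, Q @ K      # output distributions of the mechanism

# Classical fact for any eps-LDP channel:
#   TV(PK, QK) <= tanh(eps/2) * TV(P, Q).
print(tv(PK, QK) / tv(P, Q), np.tanh(eps / 2))          # both ~0.4621
# The chi^2 contraction ratio here equals tanh(eps/2)^2 (~0.2136),
# illustrating the quadratic dependence on eps that sharp chi^2
# contraction bounds exhibit.
print(chi2(PK, QK) / chi2(P, Q), np.tanh(eps / 2) ** 2)

Both ratios are attained with equality in this example: binary randomized response contracts total variation by exactly $\tanh(\varepsilon/2)$ and, against the uniform reference $Q$, contracts $\chi^2$ by exactly $\tanh^2(\varepsilon/2)$.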