Journal article

Privately Learning Mixtures of Axis-Aligned Gaussians

Abstract

We consider the problem of learning mixtures of Gaussians under the constraint of approximate differential privacy. We prove that Õ(k²d log^(3/2)(1/δ)/(α²ε)) samples are sufficient to learn a mixture of k axis-aligned Gaussians in ℝ^d to within total variation distance α while satisfying (ε, δ)-differential privacy. This is the first result for privately learning mixtures of unbounded axis-aligned (or even unbounded univariate) Gaussians. If the covariance matrix of each Gaussian is the identity matrix, we show that Õ(kd/α² + kd log(1/δ)/(αε)) samples are sufficient. To prove our results, we design a new technique for privately learning mixture distributions. A class of distributions F is said to be list-decodable if there is an algorithm that, given "heavily corrupted" samples from f ∈ F, outputs a list of distributions one of which approximates f. We show that if F is privately list-decodable, then we can learn mixtures of distributions in F. Finally, we show that axis-aligned Gaussian distributions are privately list-decodable, thereby proving that mixtures of such distributions are privately learnable.
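To make the learning setup concrete, the sketch below samples from a mixture of k axis-aligned Gaussians in ℝ^d, i.e. Gaussians with diagonal covariance matrices. This is only an illustration of the distribution class the abstract refers to; it is not the paper's private learning algorithm, and the mixture parameters are arbitrary choices for the example.

```python
import numpy as np

def sample_mixture(n, weights, means, stds, rng):
    """Draw n samples from a mixture of axis-aligned (diagonal-covariance)
    Gaussians in R^d.

    weights: shape (k,) mixing weights summing to 1
    means, stds: shape (k, d) per-component means and per-coordinate
    standard deviations (diagonal covariance = axis-aligned)
    """
    # Pick a latent component for each sample, then add per-coordinate noise.
    comps = rng.choice(len(weights), size=n, p=weights)
    d = means.shape[1]
    return means[comps] + stds[comps] * rng.standard_normal((n, d))

rng = np.random.default_rng(0)
weights = np.array([0.3, 0.7])            # k = 2 components
means = np.array([[0.0, 0.0], [5.0, -5.0]])
stds = np.array([[1.0, 2.0], [0.5, 0.5]])  # axis-aligned: one std per axis
X = sample_mixture(10_000, weights, means, stds, rng)
print(X.shape)  # (10000, 2)
```

The learner in the paper sees such samples and must output a distribution within total variation distance α of the true mixture, while the algorithm as a whole satisfies (ε, δ)-differential privacy with respect to the sample set.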

Authors

Aden-Ali I; Ashtiani H; Liaw C

Journal

Advances in Neural Information Processing Systems, Vol. 5, pp. 3925–3938

Publication Date

January 1, 2021

ISSN

1049-5258
