On the Sample Complexity of Privately Learning Unbounded
High-Dimensional Gaussians
Abstract
We provide sample complexity upper bounds for agnostically learning
multivariate Gaussians under the constraint of approximate differential
privacy. These are the first finite-sample upper bounds for general Gaussians
that do not impose restrictions on the parameters of the distribution. Our
bounds are near-optimal in the case when the covariance is known to be the
identity, and conjectured to be near-optimal in the general case. From a
technical standpoint, we provide analytic tools for arguing the existence of
global "locally small" covers from local covers of the space. These are
exploited using modifications of recent techniques for differentially private
hypothesis selection. Our techniques may prove useful for privately learning
other distribution classes that do not possess a finite cover.
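A standard primitive underlying differentially private hypothesis selection is the exponential mechanism, which selects a candidate hypothesis with probability that grows exponentially in its utility score. The sketch below is a minimal, generic illustration of that primitive only, not the paper's algorithm; the function name, the utility values, and the score scale are all illustrative assumptions.

```python
import math
import random

def exponential_mechanism(utilities, epsilon, sensitivity, rng=None):
    """Illustrative exponential mechanism: return the index of one candidate.

    Each candidate i is chosen with probability proportional to
    exp(epsilon * utilities[i] / (2 * sensitivity)). This is a generic
    sketch, not the hypothesis-selection procedure from the paper.
    """
    if rng is None:
        rng = random.Random(0)  # fixed seed for a reproducible demo
    # Subtract the max utility before exponentiating for numerical stability.
    m = max(utilities)
    weights = [math.exp(epsilon * (u - m) / (2 * sensitivity)) for u in utilities]
    total = sum(weights)
    r = rng.random() * total
    acc = 0.0
    for i, w in enumerate(weights):
        acc += w
        if r < acc:
            return i
    return len(utilities) - 1
```

With a large privacy parameter the mechanism concentrates on the highest-utility candidate; as epsilon shrinks, the selection becomes closer to uniform, which is the privacy/accuracy trade-off that hypothesis-selection analyses quantify.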