
Benefits of Additive Noise in Composing Classes with Bounded Capacity

Abstract

We observe that given two (compatible) classes of functions F and H with small capacity as measured by their uniform covering numbers, the capacity of the composition class H ◦ F can become prohibitively large or even unbounded. We then show that adding a small amount of Gaussian noise to the output of F before composing it with H can effectively control the capacity of H ◦ F, offering a general recipe for modular design. To prove our results, we define new notions of uniform covering number of random functions with respect to the total variation and Wasserstein distances. We instantiate our results for the case of multi-layer sigmoid neural networks. Preliminary empirical results on the MNIST dataset indicate that the amount of noise required to improve over existing uniform bounds can be numerically negligible (i.e., element-wise i.i.d. Gaussian noise with standard deviation 10^{-240}).
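The recipe described above can be sketched in a few lines. This is an illustrative NumPy example, not the paper's implementation: `f`, `h`, and `sigma` are hypothetical stand-ins for a member of F (a sigmoid layer), a member of H (a linear map), and the noise level, respectively.

```python
import numpy as np

rng = np.random.default_rng(0)

def f(x):
    # stand-in for a function in class F: an element-wise sigmoid (assumption)
    return 1.0 / (1.0 + np.exp(-x))

def h(z):
    # stand-in for a function in class H: a simple linear map (assumption)
    return z.sum(axis=-1)

def noisy_compose(x, sigma=1e-240):
    # the abstract's recipe: add i.i.d. Gaussian noise to the output of F
    # before composing with H; sigma can be numerically negligible
    z = f(x)
    return h(z + rng.normal(0.0, sigma, size=z.shape))

x = np.array([0.0, 1.0, -1.0])
out = noisy_compose(x)
```

With a standard deviation this small, the noisy composition is numerically indistinguishable from the noiseless h(f(x)), matching the abstract's observation that the required noise can be negligible in practice.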

Authors

Pour AF; Ashtiani H

Volume

35

Publication Date

January 1, 2022

Conference proceedings

Advances in Neural Information Processing Systems

ISSN

1049-5258
