Conference
UNIVERSAL APPROXIMATION UNDER CONSTRAINTS IS POSSIBLE WITH TRANSFORMERS
Abstract
Many practical problems need the output of a machine learning model to satisfy a set of constraints, K. There are, however, no known guarantees that classical neural networks can exactly encode constraints while simultaneously achieving universality. We provide a quantitative constrained universal approximation theorem which guarantees that for any convex or non-convex compact set K and any continuous function f : ℝⁿ → K, there is a …
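To give a flavour of how outputs can be made to lie in a constraint set K exactly, the sketch below shows one elementary mechanism for the special case of a convex K: an attention-style readout that returns a softmax-weighted average of fixed points known to lie in K. This is only an illustrative sketch under that convexity assumption, not the paper's probabilistic-transformer construction, and the function and variable names are hypothetical.

```python
import numpy as np

# Illustrative sketch (not the paper's construction): if K is convex and the
# points y_1, ..., y_N lie in K, then any softmax-weighted average of those
# points also lies in K, because convex sets contain all convex combinations
# of their elements.

def attention_readout(scores: np.ndarray, points_in_K: np.ndarray) -> np.ndarray:
    """Return a convex combination of `points_in_K` weighted by softmax(scores).

    scores      : shape (N,)   -- real-valued logits (e.g. produced by a network)
    points_in_K : shape (N, d) -- fixed points assumed to lie in a convex set K in R^d
    """
    weights = np.exp(scores - scores.max())   # numerically stable softmax
    weights /= weights.sum()
    return weights @ points_in_K              # stays in K: weights are convex coefficients

# Example: K is the probability simplex in R^3, represented by its vertices.
vertices = np.eye(3)                          # e_1, e_2, e_3 all lie in K
logits = np.array([0.2, -1.0, 3.5])           # pretend these come from a feature map
output = attention_readout(logits, vertices)
assert np.all(output >= 0) and np.isclose(output.sum(), 1.0)  # output is in the simplex
```

The paper's result is considerably more general: it covers non-convex compact K and uses a probabilistic transformer, whereas this sketch only shows why convex-combination outputs respect convex constraints by design.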
Authors
Kratsios A; Liu T; Dokmanić I; Zamanlooy B
Publication Date
January 1, 2022
Conference proceedings
ICLR 2022 - 10th International Conference on Learning Representations