Journal article

Out-of-distributional risk bounds for neural operators with applications to the Helmholtz equation

Abstract

Despite their remarkable success in approximating a wide range of operators defined by PDEs, existing neural operators (NOs) do not necessarily perform well for all physics problems. We focus here on high-frequency waves to highlight possible shortcomings. To resolve these, we propose a subfamily of NOs enabling an enhanced empirical approximation of the nonlinear operator mapping wave speed to the solution, or to boundary values, of the Helmholtz equation on a bounded domain. The latter operator is commonly referred to as the “forward” operator in the study of inverse problems, and we propose a hypernetwork version of the subfamily of NOs as a surrogate model. Our methodology draws inspiration from transformers and from techniques such as stochastic depth. Experiments reveal certain surprises in generalization and in the relevance of introducing stochastic depth. Our NOs show superior performance compared with standard NOs, not only when testing within the training distribution but also in out-of-distribution scenarios. To delve into this observation, we obtain a novel out-of-distribution risk bound tailored to Gaussian measures on Banach spaces, relating stochastic depth to the bound. We conclude by offering an in-depth analysis of the Rademacher complexity associated with our modified models and prove an upper bound tied to their stochastic depth that existing NOs do not satisfy.
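For readers unfamiliar with the technique, the "stochastic depth" referenced above randomly drops entire residual layers during training, so the effective network depth is itself random. The sketch below is a generic, minimal illustration of that mechanism in plain Python, not the paper's architecture; the function name, the `survival_prob` parameter, and the use of simple callables as layers are all assumptions for illustration.

```python
import random

def stochastic_depth_forward(x, layers, survival_prob=0.8, training=True):
    """Forward pass through a stack of residual layers where, during
    training, each layer is skipped (identity shortcut only) with
    probability 1 - survival_prob. At inference every layer is applied.

    Hypothetical illustration of stochastic depth; not the paper's model.
    """
    for layer in layers:
        if training and random.random() > survival_prob:
            continue  # layer dropped: only the identity connection survives
        x = x + layer(x)  # standard residual update x <- x + f(x)
    return x

# With survival_prob=1.0 (or training=False) no layer is dropped,
# so the forward pass is deterministic: 1.0 -> 1+1=2 -> 2+2=4.
layers = [lambda v: v, lambda v: v]
print(stochastic_depth_forward(1.0, layers, survival_prob=1.0))   # 4.0
print(stochastic_depth_forward(1.0, layers, training=False))      # 4.0
```

In the full method, a deterministic inference-time pass typically rescales each layer's contribution by its survival probability; that correction is omitted here for brevity.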

Authors

Benitez JAL; Furuya T; Faucher F; Kratsios A; Tricoche X; de Hoop MV

Journal

Journal of Computational Physics, Vol. 513

Publisher

Elsevier

Publication Date

September 15, 2024

DOI

10.1016/j.jcp.2024.113168

ISSN

0021-9991