Computing AD-compatible subgradients of convex relaxations of implicit
functions
Abstract
Automatic generation of convex relaxations and subgradients is critical in
global optimization, and is typically carried out using variants of
automatic/algorithmic differentiation (AD). At previous AD conferences,
variants of the forward and reverse AD modes were presented for evaluating
accurate subgradients of convex relaxations of supplied composite functions.
In a recent approach for generating convex relaxations of implicit functions,
these relaxations are constructed as optimal-value functions; this formulation
is versatile but complicates sensitivity analysis. We present the first
subgradient propagation rules for these implicit function relaxations, based on
supplied AD-like knowledge of the residual function. Our new subgradient rules
allow implicit function relaxations to be added to the elemental function
libraries used by forward AD modes for subgradient propagation of convex
relaxations. Proof-of-concept numerical results in Julia are presented.
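
To illustrate the kind of forward-mode elemental library the abstract refers to, the following is a minimal Julia sketch of propagating convex/concave relaxations together with their subgradients through elemental operations, in the spirit of McCormick-relaxation AD. It is an illustration only, not the authors' implementation: all names are hypothetical, only two elemental rules are shown, and the composition rule for exp omits the mid() clamping of relaxation values to the interval bounds used by full McCormick rules.

# Minimal sketch: forward-mode propagation of convex/concave relaxations
# and their subgradients (hypothetical names; not the authors' code).

struct Relax
    lo::Float64               # interval lower bound
    hi::Float64               # interval upper bound
    cv::Float64               # convex underestimator value at the current point
    cc::Float64               # concave overestimator value at the current point
    cvgrad::Vector{Float64}   # subgradient of the convex underestimator
    ccgrad::Vector{Float64}   # subgradient of the concave overestimator
end

# Seed the i-th of n independent variables on [lo, hi], evaluated at val.
function seed(val, lo, hi, i, n)
    g = zeros(n); g[i] = 1.0
    return Relax(lo, hi, val, val, g, copy(g))
end

# Sum rule: bounds, relaxations, and subgradients all add componentwise.
Base.:+(a::Relax, b::Relax) = Relax(a.lo + b.lo, a.hi + b.hi,
                                    a.cv + b.cv, a.cc + b.cc,
                                    a.cvgrad .+ b.cvgrad,
                                    a.ccgrad .+ b.ccgrad)

# exp is convex and increasing: compose exp with the convex underestimator
# for the convex part, and apply the secant of exp over [lo, hi] to the
# concave overestimator for the concave part (mid() clamping omitted).
function Base.exp(a::Relax)
    s = a.hi > a.lo ? (exp(a.hi) - exp(a.lo)) / (a.hi - a.lo) : exp(a.lo)
    return Relax(exp(a.lo), exp(a.hi),
                 exp(a.cv), exp(a.lo) + s * (a.cc - a.lo),
                 exp(a.cv) .* a.cvgrad, s .* a.ccgrad)
end

# Usage: relax f(x, y) = exp(x + y) on [0, 1]^2 at the point (0.5, 0.25).
x = seed(0.5, 0.0, 1.0, 1, 2)
y = seed(0.25, 0.0, 1.0, 2, 2)
f = exp(x + y)
@show f.cv f.cc f.cvgrad f.ccgrad

In this framework, the paper's contribution would correspond to adding one more elemental rule of this form for an implicit-function relaxation constructed as an optimal-value function, using the supplied residual-function information to produce the cv/cc values and their subgradients.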