Journal article

Branch-locking AD techniques for nonsmooth composite functions and nonsmooth implicit functions

Abstract

A recent nonsmooth vector forward mode of algorithmic differentiation (AD) computes Nesterov's L-derivatives for nonsmooth composite functions; these L-derivatives provide useful sensitivity information to methods for nonsmooth optimization and equation solving. The established reverse AD mode evaluates gradients efficiently for smooth functions, but it does not extend directly to nonsmooth functions. Thus, this article examines branch-locking strategies that harness the benefits of smooth AD techniques even in the nonsmooth case, in order to improve the computational performance of the nonsmooth vector forward AD mode. In these strategies, each nonsmooth elemental function in the original composition is ‘locked’ into an appropriate linear ‘branch’. The original composition is thereby replaced with a smooth variant, which may be subjected to efficient AD techniques for smooth functions such as the reverse AD mode. In order to choose the correct linear branches, we use inexpensive probing steps to ascertain the composite function's local behaviour. A simple implementation is described, and the developed techniques are extended to nonsmooth local implicit functions and inverse functions.
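
To illustrate the branch-locking idea described above, the following is a minimal, hypothetical Python sketch. It is not the paper's implementation: the composite function f(x) = |x1 − x2| + |x2| and the helper names lock_abs_sign and locked_gradient are illustrative assumptions. Each abs elemental is locked into one of its two linear branches using its value, with a probe direction breaking ties at kinks; the locked surrogate is smooth (here, linear), so smooth AD techniques such as the reverse mode could then be applied to it.

def lock_abs_sign(u_value, u_dot):
    # Slope (+1.0 or -1.0) of the linear branch of abs selected at u_value.
    # At a kink (u_value == 0) the probe derivative u_dot breaks the tie,
    # mimicking the inexpensive probing steps mentioned in the abstract.
    s = u_value if u_value != 0.0 else u_dot
    return 1.0 if s >= 0.0 else -1.0

def locked_gradient(x, probe):
    # Lock each abs elemental of f(x) = |x[0] - x[1]| + |x[1]| into a
    # linear branch, then differentiate the resulting smooth surrogate
    # exactly. For larger compositions, this differentiation step is
    # where an efficient smooth reverse AD mode would be used.
    s1 = lock_abs_sign(x[0] - x[1], probe[0] - probe[1])  # branch of |x1 - x2|
    s2 = lock_abs_sign(x[1], probe[1])                    # branch of |x2|
    # Locked surrogate: s1*(x[0] - x[1]) + s2*x[1]; its gradient follows.
    return [s1, -s1 + s2]

if __name__ == "__main__":
    x0 = [0.0, 0.0]      # a kink of f: both elementals are nonsmooth here
    probe = [1.0, -1.0]  # probe direction used to select branches
    print(locked_gradient(x0, probe))  # -> [1.0, -2.0]

At the kink x0 = (0, 0), the returned vector [1.0, -2.0] is the gradient of the linear piece of f that is active along the probe direction: for small t > 0, f(x0 + t(1, −1)) = 3t, which the locked surrogate x1 − 2x2 reproduces exactly.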

Authors

Khan KA

Journal

Optimization Methods and Software, Vol. 33, No. 4-6, pp. 1127–1155

Publisher

Taylor & Francis

Publication Date

November 2, 2018

DOI

10.1080/10556788.2017.1341506

ISSN

1055-6788
