ABSTRACT
Conceptually, in Algorithmic (also known as Automatic) Differentiation (AD) we consider multivariate vector functions $F : \mathbb{R}^n \to \mathbb{R}^m$ mapping a vector of independent inputs $x \in \mathbb{R}^n$ onto a vector of dependent outputs $y \in \mathbb{R}^m$ as $y = F(x)$. For the purpose of the current chapter the function $F$ is assumed to be twice continuously differentiable in a neighborhood of all arguments at which it is supposed to be evaluated. Hence, its Jacobian
$$\nabla F(x) = \left(\frac{\partial y_j}{\partial x_i}\right)_{\substack{j=1,\dots,m\\ i=1,\dots,n}} \in \mathbb{R}^{m \times n}$$
and Hessian
$$\nabla^2 F(x) = \left(\frac{\partial^2 y_j}{\partial x_i \partial x_k}\right)_{\substack{j=1,\dots,m\\ i,k=1,\dots,n}} \in \mathbb{R}^{m \times n \times n}$$
exist at all of these points. The Hessian $\nabla^2 F(x) \equiv H = (h_{j,i,k})_{j=1,\dots,m;\; i,k=1,\dots,n}$ is a symmetric 3-tensor, that is, $h_{j,i,k} = h_{j,k,i}$.
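As an illustrative sketch only (the example function, step size, and helper name are assumptions, not part of the chapter), the Hessian 3-tensor and its symmetry in the two derivative indices can be checked numerically with second-order central finite differences for a small $F : \mathbb{R}^2 \to \mathbb{R}^2$:

```python
import numpy as np

def F(x):
    # Hypothetical example function F : R^2 -> R^2, chosen for illustration.
    return np.array([x[0] * x[1] ** 2, np.sin(x[0]) + x[1] ** 3])

def hessian_tensor(F, x, h=1e-5):
    """Approximate the Hessian 3-tensor H[j, i, k] = d^2 y_j / (dx_i dx_k)
    using central finite differences in both directions i and k."""
    n = x.size
    m = F(x).size
    H = np.zeros((m, n, n))
    for i in range(n):
        for k in range(n):
            ei = np.zeros(n); ei[i] = h  # perturbation along x_i
            ek = np.zeros(n); ek[k] = h  # perturbation along x_k
            H[:, i, k] = (F(x + ei + ek) - F(x + ei - ek)
                          - F(x - ei + ek) + F(x - ei - ek)) / (4.0 * h * h)
    return H

x = np.array([0.7, -1.3])
H = hessian_tensor(F, x)
# Symmetry in the derivative indices: H[j, i, k] == H[j, k, i],
# i.e. the tensor equals its transpose over the last two axes.
print(np.allclose(H, np.transpose(H, (0, 2, 1)), atol=1e-5))
```

In practice AD tools compute such second derivatives exactly (up to rounding) rather than by finite differences; the sketch above only illustrates the index convention $h_{j,i,k}$ and the symmetry $h_{j,i,k} = h_{j,k,i}$.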