

Algorithmic Differentiation: Brief Review

First and second derivatives are calculated using algorithmic differentiation; here we give a brief review. For an excellent reference on algorithmic differentiation see Griewank [3]. The OS package uses the COIN-OR package CppAD, which is itself an excellent resource, with extensive documentation and information about algorithmic differentiation; see the documentation written by Brad Bell [1]. The development here follows the CppAD documentation. Consider a function $ f:X \rightarrow Y$ from $ \mathbb{R}^{n}$ to $ \mathbb{R}^{m}$ (that is, $ Y = f(X)$).

Express the input vector as a function of a scalar parameter $ t$ by

$\displaystyle X(t) = x^{(0)} + x^{(1)} t + x^{(2)} t^{2}$     (11)

where $ x^{(0)},$ $ x^{(1)},$ and $ x^{(2)}$ are vectors in $ \mathbb{R}^{n}$ and $ t$ is a scalar. By judiciously choosing $ x^{(0)}, x^{(1)},$ and $ x^{(2)}$ we will be able to derive many different expressions of interest. Note first that
$\displaystyle X(0) = x^{(0)}, \qquad X^{\prime}(0) = x^{(1)}, \qquad X^{\prime \prime}(0) = 2 x^{(2)}$

In general, $ x^{(k)}$ is the $ k$th-order Taylor coefficient, i.e.,
$\displaystyle x^{(k)} = \frac{1}{k!}X^{(k)}(0), \quad k = 0, 1, 2$     (12)

Then $ Y(t) = f(X(t))$ is a function from $ \mathbb{R}$ to $ \mathbb{R}^{m},$ and it can be expressed in terms of its Taylor series expansion as
$\displaystyle Y(t) = y^{(0)} + y^{(1)} t + y^{(2)} t^{2} + O(t^{3})$     (13)

where
$\displaystyle y^{(k)} = \frac{1}{k!} Y^{(k)}(0), \quad k = 0, 1, 2$     (14)
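
For instance (the particular function here is our own illustration), take $ f:\mathbb{R}^{2} \rightarrow \mathbb{R}$ with $ f(x) = x_{1}^{2} x_{2}$ and choose $ x^{(0)} = (1, 2),$ $ x^{(1)} = (1, 0),$ and $ x^{(2)} = (0, 0).$ Then $ X(t) = (1 + t,\; 2)$ and

$\displaystyle Y(t) = f(X(t)) = 2(1 + t)^{2} = 2 + 4t + 2t^{2}$

so $ y^{(0)} = 2,$ $ y^{(1)} = 4,$ and $ y^{(2)} = 2.$ As shown below, these coefficients are exactly $ f(x^{(0)}),$ the partial derivative $ \partial f / \partial x_{1} (x^{(0)}) = 2 x_{1} x_{2} = 4,$ and one half the second derivative $ \frac{1}{2} \partial^{2} f / \partial x_{1}^{2} (x^{(0)}) = x_{2} = 2.$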

The following results are shown in Bell [1] (http://www.coin-or.org/CppAD/).

$\displaystyle y^{(0)} = f(x^{(0)})$     (15)

Let $ e^{(i)}$ denote the $ i$th unit vector. If $ x^{(1)} = e^{(i)},$ then $ y^{(1)}$ is equal to the $ i$th column of the Jacobian matrix of $ f(x)$ evaluated at $ x^{(0)}.$ That is,
$\displaystyle y^{(1)} = \frac{\partial f}{\partial x_{i}} (x^{(0)}).$     (16)
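
This calculation can be carried out in code. The following is a minimal sketch using CppAD's forward mode; the function $ f(x) = x_{0}^{2} x_{1}$ (the example above, with zero-based indices) and the evaluation point are illustrative choices, not part of the OS or CppAD documentation.

#include <cppad/cppad.hpp>
#include <iostream>
#include <vector>

int main() {
    using CppAD::AD;
    size_t n = 2, m = 1;

    // Record the operation sequence for f(x) = x_0^2 * x_1
    // (an illustrative function; any taped f works the same way).
    std::vector< AD<double> > X(n), Y(m);
    X[0] = 1.0;  X[1] = 2.0;
    CppAD::Independent(X);
    Y[0] = X[0] * X[0] * X[1];
    CppAD::ADFun<double> f(X, Y);

    // Zero-order forward sweep: y^(0) = f(x^(0)), equation (15).
    std::vector<double> x0(n), y0;
    x0[0] = 1.0;  x0[1] = 2.0;
    y0 = f.Forward(0, x0);      // y0[0] = 1^2 * 2 = 2

    // First-order sweep with x^(1) = e^(0): y^(1) is column 0 of the
    // Jacobian of f evaluated at x^(0), equation (16).
    std::vector<double> x1(n, 0.0), y1;
    x1[0] = 1.0;
    y1 = f.Forward(1, x1);      // y1[0] = df/dx_0 = 2 x_0 x_1 = 4

    std::cout << "f(x^(0)) = " << y0[0]
              << ", df/dx_0 = " << y1[0] << std::endl;
    return 0;
}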

In addition, if $ x^{(1)} = e^{(i)}$ and $ x^{(2)} = 0,$ then $ Y_{k}(t) = f_{k}(x^{(0)} + t e^{(i)}),$ so for the $ k$th component function $ f_{k}(x)$ of $ f,$

$\displaystyle y^{(2)}_{k} = \frac{1}{2} \frac{\partial^2 f_{k}(x^{(0)})}{\partial x_{i} \partial x_{i}}$     (17)

To evaluate the mixed partial derivatives, one can instead set $ x^{(1)} = e^{(i)} + e^{(j)}$ and $ x^{(2)} = 0;$ then $ Y_{k}^{\prime \prime}(0)$ is the quadratic form $ (e^{(i)} + e^{(j)})^{T} \nabla^{2} f_{k}(x^{(0)}) (e^{(i)} + e^{(j)}).$ This gives, for $ f_{k}(x),$

$\displaystyle y^{(2)}_{k} = \frac{1}{2} \left( \frac{\partial^2 f_{k}(x^{(0)})}{\partial x_{i} \partial x_{i}} + 2 \frac{\partial^2 f_{k}(x^{(0)})}{\partial x_{i} \partial x_{j}} + \frac{\partial^2 f_{k}(x^{(0)})}{\partial x_{j} \partial x_{j}} \right)$     (18)

or, solving for the mixed partial derivative,
$\displaystyle \frac{\partial^2 f_{k}(x^{(0)})}{\partial x_{i} \partial x_{j}} = y^{(2)}_{k} - \frac{1}{2} \left( \frac{\partial^2 f_{k}(x^{(0)})}{\partial x_{i} \partial x_{i}} + \frac{\partial^2 f_{k}(x^{(0)})}{\partial x_{j} \partial x_{j}} \right)$     (19)
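
Continuing the sketch above (same illustrative $ f$ and point), the second-order sweeps below recover the Hessian diagonal via equation (17) and then the mixed partial via equation (19). The recipe itself, two diagonal sweeps plus one combined sweep, follows the development above; the variable names are ours.

#include <cppad/cppad.hpp>
#include <iostream>
#include <vector>

int main() {
    using CppAD::AD;
    size_t n = 2, m = 1;

    // Same illustrative tape as before: f(x) = x_0^2 * x_1.
    std::vector< AD<double> > X(n), Y(m);
    X[0] = 1.0;  X[1] = 2.0;
    CppAD::Independent(X);
    Y[0] = X[0] * X[0] * X[1];
    CppAD::ADFun<double> f(X, Y);

    std::vector<double> x0(n), x1(n), x2(n, 0.0), y2;
    x0[0] = 1.0;  x0[1] = 2.0;
    f.Forward(0, x0);

    // x^(1) = e^(0), x^(2) = 0: y^(2) = (1/2) d^2f/dx_0^2, equation (17).
    x1[0] = 1.0;  x1[1] = 0.0;
    f.Forward(1, x1);
    y2 = f.Forward(2, x2);
    double h00 = 2.0 * y2[0];   // d^2f/dx_0^2 = 2 x_1 = 4

    // x^(1) = e^(1): the other diagonal entry.
    x1[0] = 0.0;  x1[1] = 1.0;
    f.Forward(1, x1);
    y2 = f.Forward(2, x2);
    double h11 = 2.0 * y2[0];   // d^2f/dx_1^2 = 0

    // x^(1) = e^(0) + e^(1): recover the mixed partial, equation (19).
    x1[0] = 1.0;  x1[1] = 1.0;
    f.Forward(1, x1);
    y2 = f.Forward(2, x2);
    double h01 = y2[0] - 0.5 * (h00 + h11);   // d^2f/dx_0 dx_1 = 2 x_0 = 2

    std::cout << h00 << " " << h11 << " " << h01 << std::endl;
    return 0;
}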

