
Gradient Evaluation Methods

One OSInstance method for gradient calculations is

SparseJacobianMatrix *calculateAllConstraintFunctionGradients(double* x, double *objLambda,
     double *conLambda, bool new_x, int highestOrder)
If a call has already been made to calculateAllConstraintFunctionValues with highestOrder = 0, then the appropriate call to get gradient evaluations is
calculateAllConstraintFunctionGradients(x, NULL, NULL, false, 1);
Note that in this call new_x = false. This prevents a redundant call to forwardAD() with order 0 to recompute the function values.

If, at the current iterate, the Hessian of the Lagrangian function is also desired, then an appropriate call is

calculateAllConstraintFunctionGradients(x, objLambda, conLambda, false, 2);
In this case, if there was a prior call
calculateAllConstraintFunctionValues(x, w, z, true, 0);
then only the first and second derivatives are calculated; the function values are not recomputed.

When calculating the gradients, if the number of nonlinear variables is greater than or equal to the number of rows, a forwardAD(0, x) sweep is used to get the function values, and a reverseAD(1, $e^{k}$) sweep for each unit vector $e^{k}$ in the row space is used to get the vector of first-order partials for each row of the constraint Jacobian. If the number of nonlinear variables is less than the number of rows, then a forwardAD(0, x) sweep is used to get the function values, and a forwardAD(1, $e^{i}$) sweep for each unit vector $e^{i}$ in the column space is used to get the vector of first-order partials for each column of the constraint Jacobian.

Two other gradient methods are

SparseVector *calculateConstraintFunctionGradient(double* x,
    double *objLambda, double *conLambda,  int idx, bool new_x, int highestOrder);
and
SparseVector *calculateConstraintFunctionGradient(double* x, int idx,
    bool new_x );

Similar methods are available for the objective function; however, the objective function gradient methods treat the gradient of each objective function as a dense vector.


Kipp Martin 2008-01-16