scipy.optimize.fmin_ncg

scipy.optimize.fmin_ncg(f, x0, fprime, fhess_p=None, fhess=None, args=(), avextol=1e-05, epsilon=1.4901161193847656e-08, maxiter=None, full_output=0, disp=1, retall=0, callback=None)

Unconstrained minimization of a function using the Newton-CG method.

Parameters :

f : callable f(x,*args)

Objective function to be minimized.

x0 : ndarray

Initial guess.

fprime : callable f'(x,*args)

Gradient of f.

fhess_p : callable fhess_p(x,p,*args)

Function which computes the Hessian of f times an arbitrary vector, p.

fhess : callable fhess(x,*args)

Function to compute the Hessian matrix of f.

args : tuple

Extra arguments passed to f, fprime, fhess_p, and fhess (the same set of extra arguments is supplied to all of these functions).

epsilon : float or ndarray

If fhess is approximated, use this value for the step size.

callback : callable

An optional user-supplied function which is called after each iteration. Called as callback(xk), where xk is the current parameter vector.
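For instance, the callback can be used to record the iterate produced at each step (a minimal sketch, not from the original page; the Rosenbrock helpers rosen and rosen_der ship with scipy.optimize):

import numpy as np
from scipy.optimize import fmin_ncg, rosen, rosen_der

iterates = []

def record(xk):
    # Called once per iteration with the current parameter vector.
    iterates.append(np.copy(xk))

xopt = fmin_ncg(rosen, np.array([2.0, 2.0]), rosen_der,
                callback=record, disp=0)
# len(iterates) now equals the number of iterations performed.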

Returns :

xopt : ndarray

Parameters which minimize f, i.e. f(xopt) == fopt.

fopt : float

Value of the function at xopt, i.e. fopt = f(xopt).

fcalls : int

Number of function calls made.

gcalls : int

Number of gradient calls made.

hcalls : int

Number of hessian calls made.

warnflag : int

Warnings generated by the algorithm. 1 : Maximum number of iterations exceeded.

allvecs : list

The result at each iteration, if retall is True (see below).

Other Parameters :

avextol : float

Convergence is assumed when the average relative error in the minimizer falls below this amount.

maxiter : int

Maximum number of iterations to perform.

full_output : bool

If True, return the optional outputs.

disp : bool

If True, print convergence message.

retall : bool

If True, return a list of results at each iteration.

Notes

Only one of fhess_p or fhess needs to be given. If fhess is provided, fhess_p is ignored. fhess_p must compute the product of the Hessian of f with an arbitrary vector. If neither fhess nor fhess_p is provided, the Hessian-vector product is approximated using finite differences on fprime.
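For example, for a quadratic objective f(x) = 0.5*x.T A x - b.T x, the Hessian is the constant matrix A, so the Hessian-vector product can be supplied directly without ever forming or factoring a Hessian inside the solver (a minimal sketch; A and b below are made-up illustrative data):

import numpy as np
from scipy.optimize import fmin_ncg

A = np.array([[3.0, 1.0],
              [1.0, 2.0]])   # symmetric positive definite (illustrative)
b = np.array([1.0, 1.0])

def f(x):
    return 0.5 * x.dot(A).dot(x) - b.dot(x)

def fprime(x):
    return A.dot(x) - b        # gradient: A x - b

def fhess_p(x, p):
    # The Hessian of f is the constant matrix A, so the product is A p.
    return A.dot(p)

xopt = fmin_ncg(f, np.zeros(2), fprime, fhess_p=fhess_p, disp=0)
# xopt approximately solves A x = b.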

Newton-CG methods are also called truncated Newton methods. This function differs from scipy.optimize.fmin_tnc because

  1. scipy.optimize.fmin_ncg is written purely in Python using NumPy and SciPy, while scipy.optimize.fmin_tnc calls a C function.

  2. scipy.optimize.fmin_ncg handles only unconstrained minimization, while scipy.optimize.fmin_tnc handles unconstrained or box-constrained minimization. (Box constraints give lower and upper bounds for each variable separately.)

References

Wright & Nocedal, ‘Numerical Optimization’, 1999, pg. 140.
