scipy.optimize.fmin_bfgs

scipy.optimize.fmin_bfgs(f, x0, fprime=None, args=(), gtol=1e-05, norm=inf, epsilon=1.4901161193847656e-08, maxiter=None, full_output=0, disp=1, retall=0, callback=None)

Minimize a function using the BFGS algorithm.

Parameters :

f : callable f(x,*args)

Objective function to be minimized.

x0 : ndarray

Initial guess.

fprime : callable f'(x,*args)

Gradient of f.

args : tuple

Extra arguments passed to f and fprime.

gtol : float

Gradient norm must be less than gtol before successful termination.

norm : float

Order of the norm (Inf is max, -Inf is min).

epsilon : float or ndarray

If fprime is approximated, use this value for the step size.

callback : callable

An optional user-supplied function to call after each iteration. Called as callback(xk), where xk is the current parameter vector.

Returns :

xopt : ndarray

Parameters which minimize f, i.e. f(xopt) == fopt.

fopt : float

Minimum value.

gopt : ndarray

Value of gradient at minimum, f'(xopt), which should be near 0.

Bopt : ndarray

Value of 1/f''(xopt), i.e. the inverse Hessian matrix.

func_calls : int

Number of function calls made.

grad_calls : int

Number of gradient calls made.

warnflag : integer

1 : Maximum number of iterations exceeded.
2 : Gradient and/or function calls not changing.

allvecs : list

Results at each iteration. Only returned if retall is True.

Other Parameters :

maxiter : int

Maximum number of iterations to perform.

full_output : bool

If True, return fopt, func_calls, grad_calls, and warnflag in addition to xopt (see the sketch after this parameter list).

disp : bool

Print convergence message if True.

retall : bool

Return a list of results at each iteration if True.
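The following sketch illustrates how the full return tuple is unpacked when full_output and retall are both True. The convex quadratic objective, its gradient, and the starting point are made-up choices for demonstration only, not part of the reference.

import numpy as np
from scipy.optimize import fmin_bfgs

def f(x):
    # Simple convex quadratic with its minimum at (1, -2); illustrative only.
    return (x[0] - 1.0) ** 2 + (x[1] + 2.0) ** 2

def fprime(x):
    # Analytic gradient of f.
    return np.array([2.0 * (x[0] - 1.0), 2.0 * (x[1] + 2.0)])

x0 = np.zeros(2)

# With full_output=True and retall=True the result is
# (xopt, fopt, gopt, Bopt, func_calls, grad_calls, warnflag, allvecs).
out = fmin_bfgs(f, x0, fprime=fprime, full_output=True, retall=True, disp=False)
xopt, fopt, gopt, Bopt, func_calls, grad_calls, warnflag, allvecs = out

print(xopt)          # approximately [ 1. -2.]
print(warnflag)      # 0 on success; 1 or 2 correspond to the warnings above
print(len(allvecs))  # the sequence of iterates, starting from x0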

Notes

Optimize the function, f, whose gradient is given by fprime, using the quasi-Newton method of Broyden, Fletcher, Goldfarb, and Shanno (BFGS).
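As a rough sketch of the basic call, the example below minimizes the Rosenbrock test function that ships with scipy.optimize (rosen and rosen_der), passing an analytic gradient and a simple callback that records each iterate; the starting point and tolerance are illustrative choices.

import numpy as np
from scipy.optimize import fmin_bfgs, rosen, rosen_der

iterates = []

def record(xk):
    # Called once after each iteration with the current parameter vector.
    iterates.append(np.copy(xk))

x0 = np.array([1.3, 0.7, 0.8, 1.9, 1.2])
xopt = fmin_bfgs(rosen, x0, fprime=rosen_der, gtol=1e-6, callback=record)

print(xopt)           # close to the known minimum at [1, 1, 1, 1, 1]
print(len(iterates))  # number of iterations taken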

References

Wright and Nocedal, 'Numerical Optimization', 1999, p. 198.
