scipy.optimize.minimize(fun, x0, args=(), method='BFGS', jac=None, hess=None, hessp=None, bounds=None, constraints=(), tol=None, callback=None, options=None)

Minimization of a scalar function of one or more variables using the BFGS algorithm.

See also

For documentation for the rest of the parameters, see scipy.optimize.minimize
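As a quick illustration (not part of the reference text), BFGS is selected through the method argument of scipy.optimize.minimize; the sketch below uses SciPy's built-in Rosenbrock helpers rosen and rosen_der:

```python
import numpy as np
from scipy.optimize import minimize, rosen, rosen_der

# Minimize the Rosenbrock function with an analytic gradient (jac).
x0 = np.array([1.3, 0.7, 0.8, 1.9, 1.2])
res = minimize(rosen, x0, method='BFGS', jac=rosen_der,
               options={'gtol': 1e-6})

print(res.x)    # should lie near the known minimum at all ones
print(res.nit)  # number of BFGS iterations performed
```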


Options

disp : bool

Set to True to print convergence messages.


maxiter : int

Maximum number of iterations to perform.


gtol : float

Terminate successfully if gradient norm is less than gtol.


norm : float

Order of norm (Inf is max, -Inf is min).

eps : float or ndarray

If jac is None, the absolute step size used for numerical approximation of the jacobian via forward differences.

return_all : bool, optional

Set to True to return a list of the best solution at each of the iterations.
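For example (a sketch, assuming the standard OptimizeResult interface), the per-iteration iterates are returned in the allvecs attribute of the result when this option is set:

```python
import numpy as np
from scipy.optimize import minimize, rosen

x0 = np.array([1.3, 0.7, 0.8, 1.9, 1.2])
res = minimize(rosen, x0, method='BFGS',
               options={'return_all': True})

# allvecs holds one parameter vector per iteration, starting at x0.
print(len(res.allvecs))
```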

finite_diff_rel_step : None or array_like, optional

If jac in ['2-point', '3-point', 'cs'], the relative step size to use for numerical approximation of the jacobian. The absolute step size is computed as h = rel_step * sign(x) * max(1, abs(x)), possibly adjusted to fit into the bounds. For method='3-point' the sign of h is ignored. If None (default) then the step is selected automatically.
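As an illustrative sketch (option names as listed above), a finite-difference jacobian is requested by passing one of the difference schemes as jac, with the relative step supplied through options:

```python
import numpy as np
from scipy.optimize import minimize, rosen

x0 = np.array([1.3, 0.7, 0.8, 1.9, 1.2])
# No analytic gradient: approximate it with 3-point differences,
# using a hand-chosen relative step instead of the automatic one.
res = minimize(rosen, x0, method='BFGS', jac='3-point',
               options={'finite_diff_rel_step': 1e-6})
print(res.x)
```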

xrtol : float, default: 0

Relative tolerance for x. Terminate successfully if step size is less than xk * xrtol where xk is the current parameter vector.
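Pulling several of these options together, a minimal sketch (the quadratic objective here is just an illustration, not from the reference text):

```python
import numpy as np
from scipy.optimize import minimize

def f(x):
    # Simple quadratic with its minimum at (2, -1).
    return (x[0] - 2.0) ** 2 + (x[1] + 1.0) ** 2

# jac is None, so the gradient is approximated by forward differences
# with absolute step eps; norm=inf makes the gtol test use the
# max-norm of the gradient.
res = minimize(f, np.zeros(2), method='BFGS',
               options={'gtol': 1e-6, 'norm': np.inf, 'eps': 1e-8})
print(res.x)
```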