Optimization and root finding (scipy.optimize)

Optimization

Local Optimization
minimize(fun, x0[, args, method, jac, hess, …]) | Minimization of scalar function of one or more variables.
minimize_scalar(fun[, bracket, bounds, …]) | Minimization of scalar function of one variable.
OptimizeResult | Represents the optimization result.
OptimizeWarning | Warning issued by optimization routines.
The minimize function supports the following methods (a short usage sketch follows the list):
- minimize(method='Nelder-Mead')
- minimize(method='Powell')
- minimize(method='CG')
- minimize(method='BFGS')
- minimize(method='Newton-CG')
- minimize(method='L-BFGS-B')
- minimize(method='TNC')
- minimize(method='COBYLA')
- minimize(method='SLSQP')
- minimize(method='trust-constr')
- minimize(method='dogleg')
- minimize(method='trust-ncg')
- minimize(method='trust-krylov')
- minimize(method='trust-exact')
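As a rough illustration of the interface (a sketch only; the starting point is arbitrary), the built-in Rosenbrock test function can be minimized with any of the methods above:

>>> import numpy as np
>>> from scipy.optimize import minimize, rosen, rosen_der
>>> x0 = np.array([1.3, 0.7, 0.8, 1.9, 1.2])
>>> # Derivative-free simplex search:
>>> res = minimize(rosen, x0, method='Nelder-Mead')
>>> # Quasi-Newton search using the analytic gradient:
>>> res = minimize(rosen, x0, method='BFGS', jac=rosen_der)
>>> res.x  # approximately [1, 1, 1, 1, 1], the known minimizer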
Constraints are passed to the minimize function as a single object or as a list of objects from the following classes:
NonlinearConstraint(fun, lb, ub[, jac, …]) | Nonlinear constraint on the variables.
LinearConstraint(A, lb, ub[, keep_feasible]) | Linear constraint on the variables.
Simple bound constraints are handled separately and there is a special class for them:
Bounds(lb, ub[, keep_feasible]) | Bounds constraint on the variables.
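A minimal sketch of passing constraint objects to minimize; the toy problem below (minimize x^2 + y^2 subject to x + y >= 1, x*y <= 1, and box bounds) is invented for illustration, and 'trust-constr' is the method that accepts all three classes directly:

>>> import numpy as np
>>> from scipy.optimize import (minimize, LinearConstraint,
...                             NonlinearConstraint, Bounds)
>>> fun = lambda x: x[0]**2 + x[1]**2
>>> lin = LinearConstraint([[1, 1]], 1, np.inf)                    # x + y >= 1
>>> nonlin = NonlinearConstraint(lambda x: x[0]*x[1], -np.inf, 1)  # x*y <= 1
>>> bnds = Bounds([0, 0], [2, 2])                                  # 0 <= x, y <= 2
>>> res = minimize(fun, [2, 0], method='trust-constr',
...                constraints=[lin, nonlin], bounds=bnds)
>>> res.x  # approximately [0.5, 0.5]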
Quasi-Newton strategies implementing the HessianUpdateStrategy interface can be used to approximate the Hessian in the minimize function (available only for the 'trust-constr' method). The available quasi-Newton methods implementing this interface are:
BFGS([exception_strategy, min_curvature, …]) | Broyden-Fletcher-Goldfarb-Shanno (BFGS) Hessian update strategy.
SR1([min_denominator, init_scale]) | Symmetric-rank-1 Hessian update strategy.
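For example (a sketch, not the only usage), either strategy can be passed as the hess argument of minimize under 'trust-constr':

>>> import numpy as np
>>> from scipy.optimize import minimize, BFGS, SR1, rosen, rosen_der
>>> x0 = np.array([1.3, 0.7, 0.8])
>>> # Approximate the Hessian with BFGS updates...
>>> res = minimize(rosen, x0, method='trust-constr', jac=rosen_der, hess=BFGS())
>>> # ...or with symmetric-rank-1 updates:
>>> res = minimize(rosen, x0, method='trust-constr', jac=rosen_der, hess=SR1())
>>> res.x  # approximately [1, 1, 1]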
The minimize_scalar function supports the following methods (a short usage sketch follows the list):
- minimize_scalar(method='brent')
- minimize_scalar(method='bounded')
- minimize_scalar(method='golden')
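A minimal sketch of the three methods on an invented quadratic with its minimum at x = 2 (note that 'bounded' requires the bounds argument):

>>> from scipy.optimize import minimize_scalar
>>> f = lambda x: (x - 2)**2
>>> minimize_scalar(f, method='brent').x                    # approximately 2.0
>>> minimize_scalar(f, method='golden').x                   # approximately 2.0
>>> minimize_scalar(f, bounds=(0, 10), method='bounded').x  # approximately 2.0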
The specific optimization method interfaces below in this subsection are not recommended for use in new scripts; all of these methods are accessible via a newer, more consistent interface provided by the functions above.
General-purpose multivariate methods:
fmin(func, x0[, args, xtol, ftol, maxiter, …]) | Minimize a function using the downhill simplex algorithm.
fmin_powell(func, x0[, args, xtol, ftol, …]) | Minimize a function using modified Powell's method.
fmin_cg(f, x0[, fprime, args, gtol, norm, …]) | Minimize a function using a nonlinear conjugate gradient algorithm.
fmin_bfgs(f, x0[, fprime, args, gtol, norm, …]) | Minimize a function using the BFGS algorithm.
fmin_ncg(f, x0, fprime[, fhess_p, fhess, …]) | Unconstrained minimization of a function using the Newton-CG method.
Constrained multivariate methods:
fmin_l_bfgs_b(func, x0[, fprime, args, …]) | Minimize a function func using the L-BFGS-B algorithm.
fmin_tnc(func, x0[, fprime, args, …]) | Minimize a function with variables subject to bounds, using gradient information in a truncated Newton algorithm.
fmin_cobyla(func, x0, cons[, args, …]) | Minimize a function using the Constrained Optimization BY Linear Approximation (COBYLA) method.
fmin_slsqp(func, x0[, eqcons, f_eqcons, …]) | Minimize a function using Sequential Least SQuares Programming.
differential_evolution(func, bounds[, args, …]) | Finds the global minimum of a multivariate function.
Univariate (scalar) minimization methods:
fminbound(func, x1, x2[, args, xtol, …]) | Bounded minimization for scalar functions.
brent(func[, args, brack, tol, full_output, …]) | Given a function of one variable and a possible bracket, return a local minimum of the function isolated to a fractional precision of tol.
golden(func[, args, brack, tol, …]) | Return the minimum of a function of one variable using the golden-section method.
Equation (Local) Minimizers
leastsq(func, x0[, args, Dfun, full_output, …]) | Minimize the sum of squares of a set of equations.
least_squares(fun, x0[, jac, bounds, …]) | Solve a nonlinear least-squares problem with bounds on the variables.
nnls(A, b[, maxiter]) | Solve argmin_x ||Ax - b||_2 for x >= 0.
lsq_linear(A, b[, bounds, method, tol, …]) | Solve a linear least-squares problem with bounds on the variables.
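As an illustration of least_squares (the model, noise level, and bounds below are invented for the example), fitting an exponential decay to synthetic data:

>>> import numpy as np
>>> from scipy.optimize import least_squares
>>> rng = np.random.default_rng(0)
>>> t = np.linspace(0, 10, 50)
>>> y = 2.5 * np.exp(-1.3 * t) + 0.05 * rng.standard_normal(t.size)
>>> def residuals(p):
...     a, b = p
...     return a * np.exp(-b * t) - y
>>> res = least_squares(residuals, [1.0, 1.0], bounds=([0, 0], [10, 10]))
>>> res.x  # approximately [2.5, 1.3]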
Global Optimization
basinhopping(func, x0[, niter, T, stepsize, …]) | Find the global minimum of a function using the basin-hopping algorithm.
brute(func, ranges[, args, Ns, full_output, …]) | Minimize a function over a given range by brute force.
differential_evolution(func, bounds[, args, …]) | Finds the global minimum of a multivariate function.
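For instance, differential_evolution needs only bounds, not a starting point; a sketch on the two-dimensional Rosenbrock function (the seed is arbitrary):

>>> from scipy.optimize import differential_evolution, rosen
>>> res = differential_evolution(rosen, bounds=[(-5, 5), (-5, 5)], seed=1)
>>> res.x, res.fun  # approximately ([1, 1], 0)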
Rosenbrock function
rosen(x) | The Rosenbrock function.
rosen_der(x) | The derivative (i.e. gradient) of the Rosenbrock function.
rosen_hess(x) | The Hessian matrix of the Rosenbrock function.
rosen_hess_prod(x, p) | Product of the Hessian matrix of the Rosenbrock function with a vector.
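A quick consistency sketch: rosen_hess_prod(x, p) matches the explicit product rosen_hess(x) @ p (the evaluation point is arbitrary):

>>> import numpy as np
>>> from scipy.optimize import rosen, rosen_der, rosen_hess, rosen_hess_prod
>>> x = np.array([1.2, 1.0, 0.9])
>>> p = np.ones_like(x)
>>> np.allclose(rosen_hess(x) @ p, rosen_hess_prod(x, p))
True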
Fitting
curve_fit(f, xdata, ydata[, p0, sigma, …]) | Use non-linear least squares to fit a function, f, to data.
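A minimal curve_fit sketch on synthetic data (the model and its true parameters are invented for illustration):

>>> import numpy as np
>>> from scipy.optimize import curve_fit
>>> def f(x, a, b):
...     return a * np.sin(b * x)
>>> xdata = np.linspace(0, 4, 50)
>>> rng = np.random.default_rng(0)
>>> ydata = f(xdata, 2.0, 1.5) + 0.1 * rng.standard_normal(50)
>>> popt, pcov = curve_fit(f, xdata, ydata, p0=[1.0, 1.0])
>>> popt  # approximately [2.0, 1.5]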
Root finding

Scalar functions
brentq(f, a, b[, args, xtol, rtol, maxiter, …]) | Find a root of a function in a bracketing interval using Brent's method.
brenth(f, a, b[, args, xtol, rtol, maxiter, …]) | Find a root of f in a bracketing interval [a, b] using Brent's method with hyperbolic extrapolation.
ridder(f, a, b[, args, xtol, rtol, maxiter, …]) | Find a root of a function in an interval using Ridder's method.
bisect(f, a, b[, args, xtol, rtol, maxiter, …]) | Find a root of a function within an interval using bisection.
newton(func, x0[, fprime, args, tol, …]) | Find a zero using the Newton-Raphson or secant method.
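The bracketing solvers share the (f, a, b) calling pattern and require a sign change on [a, b]; newton needs only a starting point. A sketch on the invented test function f(x) = x^2 - 2, whose positive root is sqrt(2):

>>> from scipy.optimize import brentq, newton
>>> f = lambda x: x**2 - 2
>>> brentq(f, 0, 2)                       # approximately 1.4142
>>> newton(f, 1.0, fprime=lambda x: 2*x)  # same root via Newton-Raphson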
Fixed point finding:
fixed_point(func, x0[, args, xtol, maxiter, …]) | Find a fixed point of the function.
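For example, solving x = cos(x), whose fixed point lies near 0.739:

>>> import numpy as np
>>> from scipy.optimize import fixed_point
>>> fixed_point(np.cos, 0.5)  # approximately 0.7391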
Multidimensional
General nonlinear solvers:
root(fun, x0[, args, method, jac, tol, …]) | Find a root of a vector function.
fsolve(func, x0[, args, fprime, …]) | Find the roots of a function.
broyden1(F, xin[, iter, alpha, …]) | Find a root of a function, using Broyden's first Jacobian approximation.
broyden2(F, xin[, iter, alpha, …]) | Find a root of a function, using Broyden's second Jacobian approximation.
The root function supports the following methods (a short usage sketch follows the list):
- root(method='hybr')
- root(method='lm')
- root(method='broyden1')
- root(method='broyden2')
- root(method='anderson')
- root(method='linearmixing')
- root(method='diagbroyden')
- root(method='excitingmixing')
- root(method='krylov')
- root(method='df-sane')
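A sketch of root on a small two-equation system using the default 'hybr' method (fsolve is the legacy wrapper around the same solver):

>>> from scipy.optimize import root
>>> def F(x):
...     return [x[0] + 0.5*(x[0] - x[1])**3 - 1.0,
...             0.5*(x[1] - x[0])**3 + x[1]]
>>> sol = root(F, [0.0, 0.0], method='hybr')
>>> sol.x  # approximately [0.8411, 0.1589]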
Large-scale nonlinear solvers:
newton_krylov(F, xin[, iter, rdiff, method, …]) | Find a root of a function, using Krylov approximation for inverse Jacobian.
anderson(F, xin[, iter, alpha, w0, M, …]) | Find a root of a function, using (extended) Anderson mixing.
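These solvers never form the full Jacobian, so they suit large problems; the small invented system below is purely illustrative:

>>> import numpy as np
>>> from scipy.optimize import newton_krylov
>>> def F(x):
...     # Componentwise x + 0.1*x**3 = c, monotone and easily solvable.
...     return x + 0.1*x**3 - np.linspace(0, 1, x.size)
>>> sol = newton_krylov(F, np.zeros(10))
>>> abs(F(sol)).max()  # residual near zero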
Simple iterations:
excitingmixing(F, xin[, iter, alpha, …]) | Find a root of a function, using a tuned diagonal Jacobian approximation.
linearmixing(F, xin[, iter, alpha, verbose, …]) | Find a root of a function, using a scalar Jacobian approximation.
diagbroyden(F, xin[, iter, alpha, verbose, …]) | Find a root of a function, using diagonal Broyden Jacobian approximation.
Linear Programming
General linear programming solver:
linprog(c[, A_ub, b_ub, A_eq, b_eq, bounds, …]) | Minimize a linear objective function subject to linear equality and inequality constraints.
The linprog function supports the following methods (a short usage sketch appears below):
- linprog(method='simplex')
- linprog(method='interior-point')
The simplex method supports callback functions, such as:
linprog_verbose_callback(xk, **kwargs) | A sample callback function demonstrating the linprog callback interface.
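A minimal linprog sketch; since linprog minimizes, a maximization is posed by negating the objective. The toy problem (maximize x + 2y subject to x + y <= 4, x <= 3, x, y >= 0) is invented for illustration:

>>> from scipy.optimize import linprog
>>> c = [-1, -2]             # negate to maximize x + 2y
>>> A_ub = [[1, 1], [1, 0]]  # x + y <= 4 and x <= 3
>>> b_ub = [4, 3]
>>> res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
>>> res.x, res.fun  # approximately ([0, 4], -8)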
Assignment problems:
linear_sum_assignment(cost_matrix) | Solve the linear sum assignment problem.
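For example, with a 3x3 cost matrix the optimal row-to-column assignment and its total cost are:

>>> import numpy as np
>>> from scipy.optimize import linear_sum_assignment
>>> cost = np.array([[4, 1, 3],
...                  [2, 0, 5],
...                  [3, 2, 2]])
>>> row_ind, col_ind = linear_sum_assignment(cost)
>>> col_ind, cost[row_ind, col_ind].sum()  # (array([1, 0, 2]), 5)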
Utilities
approx_fprime(xk, f, epsilon, *args) | Finite-difference approximation of the gradient of a scalar function.
bracket(func[, xa, xb, args, grow_limit, …]) | Bracket the minimum of the function.
check_grad(func, grad, x0, *args, **kwargs) | Check the correctness of a gradient function by comparing it against a (forward) finite-difference approximation of the gradient.
line_search(f, myfprime, xk, pk[, gfk, …]) | Find alpha that satisfies strong Wolfe conditions.
show_options([solver, method, disp]) | Show documentation for additional options of optimization solvers.
LbfgsInvHessProduct(sk, yk) | Linear operator for the L-BFGS approximate inverse Hessian.
HessianUpdateStrategy | Interface for implementing Hessian update strategies.
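As a sketch of the derivative utilities above, check_grad compares an analytic gradient against the finite-difference approximation that approx_fprime computes (the test function here is invented):

>>> import numpy as np
>>> from scipy.optimize import check_grad, approx_fprime
>>> f = lambda x: x[0]**2 + x[1]**3
>>> grad = lambda x: np.array([2*x[0], 3*x[1]**2])
>>> x0 = np.array([1.5, -0.5])
>>> check_grad(f, grad, x0)     # small value => gradient is consistent
>>> approx_fprime(x0, f, 1e-8)  # approximately grad(x0) = [3.0, 0.75]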