Optimization and root finding (scipy.optimize)¶
SciPy optimize provides functions for minimizing (or maximizing) objective functions, possibly subject to constraints. It includes solvers for nonlinear problems (with support for both local and global optimization algorithms), linear programming, constrained and nonlinear least-squares, root finding, and curve fitting.
Common functions and objects, shared across different solvers, are:
- show_options – Show documentation for additional options of optimization solvers.
- OptimizeResult – Represents the optimization result.
Optimization¶
Scalar functions optimization¶
- minimize_scalar – Minimization of scalar function of one variable.

The minimize_scalar function supports the following methods:
- minimize_scalar(method='brent')
- minimize_scalar(method='bounded')
- minimize_scalar(method='golden')
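A minimal usage sketch (the quadratic objective here is purely illustrative):

```python
from scipy.optimize import minimize_scalar

# Minimize f(x) = (x - 2)^2; the default Brent method needs
# neither a bracket nor bounds for this smooth objective.
res = minimize_scalar(lambda x: (x - 2) ** 2)
print(res.x)  # close to 2.0
```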
Local (multivariate) optimization¶
- minimize – Minimization of scalar function of one or more variables.

The minimize function supports the following methods:
- minimize(method='Nelder-Mead')
- minimize(method='Powell')
- minimize(method='CG')
- minimize(method='BFGS')
- minimize(method='Newton-CG')
- minimize(method='L-BFGS-B')
- minimize(method='TNC')
- minimize(method='COBYLA')
- minimize(method='SLSQP')
- minimize(method='trust-constr')
- minimize(method='dogleg')
- minimize(method='trust-ncg')
- minimize(method='trust-krylov')
- minimize(method='trust-exact')
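A short sketch of unconstrained local minimization; the objective below is an illustrative quadratic with known minimizer (1, 2.5):

```python
import numpy as np
from scipy.optimize import minimize

# Smooth test objective; its unique minimum is at x = (1, 2.5).
def f(x):
    return (x[0] - 1) ** 2 + (x[1] - 2.5) ** 2

res = minimize(f, x0=[0.0, 0.0], method='BFGS')
print(res.x)  # approximately [1.0, 2.5]
```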
Constraints are passed to the minimize function as a single object or as a list of objects from the following classes:
- NonlinearConstraint – Nonlinear constraint on the variables.
- LinearConstraint – Linear constraint on the variables.
Simple bound constraints are handled separately and there is a special class for them:
- Bounds – Bounds constraint on the variables.
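A sketch of a constrained solve combining both classes; the problem is an illustrative example whose analytic solution is (0.25, 1.75):

```python
import numpy as np
from scipy.optimize import minimize, LinearConstraint, Bounds

# Minimize (x0-1)^2 + (x1-2.5)^2 subject to x0 + x1 <= 2 and x >= 0.
def f(x):
    return (x[0] - 1) ** 2 + (x[1] - 2.5) ** 2

lc = LinearConstraint([[1, 1]], -np.inf, 2)  # x0 + x1 <= 2
bnds = Bounds(0, np.inf)                     # x0, x1 >= 0
res = minimize(f, x0=[0.5, 0.5], method='trust-constr',
               constraints=[lc], bounds=bnds)
```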
Quasi-Newton strategies implementing the HessianUpdateStrategy interface can be used to approximate the Hessian in the minimize function (available only for the 'trust-constr' method). Available quasi-Newton methods implementing this interface are:
- BFGS – Broyden-Fletcher-Goldfarb-Shanno (BFGS) Hessian update strategy.
- SR1 – Symmetric-rank-1 Hessian update strategy.
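A sketch of passing such a strategy as the hess argument, using the built-in Rosenbrock benchmark as the test function:

```python
import numpy as np
from scipy.optimize import minimize, SR1, rosen, rosen_der

# Supply the exact gradient but let trust-constr build a
# symmetric-rank-1 quasi-Newton approximation of the Hessian.
res = minimize(rosen, np.array([0.5, 0.5]), method='trust-constr',
               jac=rosen_der, hess=SR1())
```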
Global optimization¶
- basinhopping – Find the global minimum of a function using the basin-hopping algorithm.
- brute – Minimize a function over a given range by brute force.
- differential_evolution – Finds the global minimum of a multivariate function.
- shgo – Finds the global minimum of a function using SHG optimization.
- dual_annealing – Find the global minimum of a function using Dual Annealing.
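A minimal sketch with one of these solvers; the objective and box bounds are illustrative, and the seed just makes the stochastic search repeatable:

```python
from scipy.optimize import differential_evolution

# Globally minimize a simple bowl over a box; the minimum is at (0.5, -1.0).
def f(x):
    return (x[0] - 0.5) ** 2 + (x[1] + 1.0) ** 2

res = differential_evolution(f, bounds=[(-2, 2), (-2, 2)], seed=1)
```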
Least-squares and curve fitting¶
Nonlinear least-squares¶
- least_squares – Solve a nonlinear least-squares problem with bounds on the variables.
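A sketch of a least_squares call; the line-fitting data below are fabricated for illustration:

```python
import numpy as np
from scipy.optimize import least_squares

# Recover slope and intercept of a line from noiseless samples.
xdata = np.linspace(0, 1, 20)
ydata = 3.0 * xdata + 1.0

def residuals(p):
    return p[0] * xdata + p[1] - ydata

res = least_squares(residuals, x0=[1.0, 0.0])
print(res.x)  # approximately [3.0, 1.0]
```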
Linear least-squares¶
- nnls – Solve argmin_x || Ax - b ||_2 for x >= 0.
- lsq_linear – Solve a linear least-squares problem with bounds on the variables.
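A sketch of a bounded linear least-squares solve; the small system is an illustrative example whose unconstrained solution x = [1, 2] already satisfies the bounds:

```python
import numpy as np
from scipy.optimize import lsq_linear

# Overdetermined 3x2 system with a nonnegativity bound on x.
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
b = np.array([1.0, 2.0, 3.0])
res = lsq_linear(A, b, bounds=(0, np.inf))
```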
Root finding¶
Scalar functions¶
- root_scalar – Find a root of a scalar function.
- brentq – Find a root of a function in a bracketing interval using Brent's method.
- brenth – Find a root of a function in a bracketing interval using Brent's method with hyperbolic extrapolation.
- ridder – Find a root of a function in an interval using Ridder's method.
- bisect – Find root of a function within an interval using bisection.
- newton – Find a zero of a real or complex function using the Newton-Raphson (or secant or Halley's) method.
- toms748 – Find a zero using TOMS Algorithm 748 method.
- RootResults – Represents the root finding result.
The root_scalar function supports the following methods:
- root_scalar(method='bisect')
- root_scalar(method='brentq')
- root_scalar(method='brenth')
- root_scalar(method='ridder')
- root_scalar(method='newton')
- root_scalar(method='toms748')
- root_scalar(method='secant')
- root_scalar(method='halley')
The table below lists situations and appropriate methods, along with
asymptotic convergence rates per iteration (and per function evaluation)
for successful convergence to a simple root(*).
Bisection is the slowest of them all, adding one bit of accuracy for each
function evaluation, but is guaranteed to converge.
The other bracketing methods all (eventually) increase the number of accurate
bits by about 50% for every function evaluation.
The derivative-based methods, all built on newton, can converge quite quickly
if the initial value is close to the root. They can also be applied to
functions defined on (a subset of) the complex plane.
| Domain of f | Bracket? | fprime | fprime2 | Solvers | Guaranteed? | Rate(s) (*) |
|---|---|---|---|---|---|---|
| R | Yes | N/A | N/A | bisect | Yes | 1 (1) |
| R | Yes | N/A | N/A | brentq | Yes | ≤ 1.62 (≤ 1.62) |
| R | Yes | N/A | N/A | brenth | Yes | ≤ 1.62 (≤ 1.62) |
| R | Yes | N/A | N/A | ridder | Yes | 2.0 (1.41) |
| R | Yes | N/A | N/A | toms748 | Yes | 2.7 (1.9) |
| R or C | No | No | No | secant | No | 1.62 (1.62) |
| R or C | No | Yes | No | newton | No | 2.00 (1.41) |
| R or C | No | Yes | Yes | halley | No | 3.00 (1.44) |
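A sketch of the bracketing approach the table recommends when a sign-changing interval is known; the function and interval are illustrative:

```python
from scipy.optimize import root_scalar

# f(x) = x^2 - 2 changes sign on [1, 2], so brentq is guaranteed to converge.
sol = root_scalar(lambda x: x * x - 2.0, bracket=[1, 2], method='brentq')
print(sol.root)  # close to sqrt(2) ~ 1.41421
```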
See also
scipy.optimize.cython_optimize
– Typed Cython versions of zeros functions
Fixed point finding:
- fixed_point – Find a fixed point of the function.
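A minimal sketch using the classic illustrative example x = cos(x), whose unique fixed point is near 0.739:

```python
import numpy as np
from scipy.optimize import fixed_point

# Iterate toward the point where cos(x) == x, starting from 0.5.
x_star = fixed_point(np.cos, 0.5)
```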
Linear programming¶
- linprog – Linear programming: minimize a linear objective function subject to linear equality and inequality constraints.
The linprog function supports the following methods:
- linprog(method='interior-point')
- linprog(method='revised simplex')
- linprog(method='simplex')
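A sketch of a small linear program (the coefficients are illustrative); since linprog minimizes, a maximization objective is negated:

```python
from scipy.optimize import linprog

# Maximize x0 + 2*x1 (minimize its negation) subject to
# x0 + x1 <= 4 and the default bounds x >= 0.
res = linprog(c=[-1, -2], A_ub=[[1, 1]], b_ub=[4])
print(res.fun)  # -8.0, achieved at x = [0, 4]
```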
The simplex method supports callback functions, such as:
- linprog_verbose_callback – A sample callback function demonstrating the linprog callback interface.
Assignment problems:
- linear_sum_assignment – Solve the linear sum assignment problem.
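A sketch with a small illustrative cost matrix, matching each row (worker) to one column (task) at minimal total cost:

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

cost = np.array([[4, 1, 3],
                 [2, 0, 5],
                 [3, 2, 2]])
rows, cols = linear_sum_assignment(cost)
total = cost[rows, cols].sum()
print(total)  # 5
```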
Utilities¶
Finite-difference approximation¶
- approx_fprime – Finite-difference approximation of the gradient of a scalar function.
- check_grad – Check the correctness of a gradient function by comparing it against a (forward) finite-difference approximation of the gradient.
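A sketch of verifying a hand-written gradient (the test function is illustrative); a small return value means the analytic and finite-difference gradients agree:

```python
import numpy as np
from scipy.optimize import check_grad

def f(x):
    return x[0] ** 2 + x[1] ** 3

def grad(x):
    return np.array([2 * x[0], 3 * x[1] ** 2])

# Returns the 2-norm of the difference between the two gradients at x.
err = check_grad(f, grad, np.array([1.5, -1.5]))
```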
Line search¶
- bracket – Bracket the minimum of the function.
- line_search – Find alpha that satisfies strong Wolfe conditions.
Hessian approximation¶
- LbfgsInvHessProduct – Linear operator for the L-BFGS approximate inverse Hessian.
- HessianUpdateStrategy – Interface for implementing Hessian update strategies.
Benchmark problems¶
- rosen – The Rosenbrock function.
- rosen_der – The derivative (i.e. gradient) of the Rosenbrock function.
- rosen_hess – The Hessian matrix of the Rosenbrock function.
- rosen_hess_prod – Product of the Hessian matrix of the Rosenbrock function with a vector.
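A sketch using the benchmark function and its exact derivatives with a second-order method (the starting point is arbitrary):

```python
import numpy as np
from scipy.optimize import minimize, rosen, rosen_der, rosen_hess

# The Rosenbrock function has its global minimum at the vector of ones.
x0 = np.array([1.3, 0.7, 0.8, 1.9, 1.2])
res = minimize(rosen, x0, method='Newton-CG', jac=rosen_der, hess=rosen_hess)
```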
Legacy functions¶
The functions below are not recommended for use in new scripts; all of these methods are accessible via the newer, more consistent interfaces provided above.
Optimization¶
General-purpose multivariate methods:
- fmin – Minimize a function using the downhill simplex algorithm.
- fmin_powell – Minimize a function using modified Powell's method.
- fmin_cg – Minimize a function using a nonlinear conjugate gradient algorithm.
- fmin_bfgs – Minimize a function using the BFGS algorithm.
- fmin_ncg – Unconstrained minimization of a function using the Newton-CG method.
Constrained multivariate methods:
- fmin_l_bfgs_b – Minimize a function func using the L-BFGS-B algorithm.
- fmin_tnc – Minimize a function with variables subject to bounds, using gradient information in a truncated Newton algorithm.
- fmin_cobyla – Minimize a function using the Constrained Optimization By Linear Approximation (COBYLA) method.
- fmin_slsqp – Minimize a function using Sequential Least Squares Programming.
Univariate (scalar) minimization methods:
- fminbound – Bounded minimization for scalar functions.
- brent – Given a function of one variable and a possible bracket, return the local minimum of the function isolated to a fractional precision of tol.
- golden – Return the minimum of a function of one variable using the golden section method.
Least-squares¶
- leastsq – Minimize the sum of squares of a set of equations.
Root finding¶
General nonlinear solvers:
- fsolve – Find the roots of a function.
- broyden1 – Find a root of a function, using Broyden's first Jacobian approximation.
- broyden2 – Find a root of a function, using Broyden's second Jacobian approximation.
Large-scale nonlinear solvers:
- newton_krylov – Find a root of a function, using Krylov approximation for inverse Jacobian.
- anderson – Find a root of a function, using (extended) Anderson mixing.
Simple iteration solvers:
- excitingmixing – Find a root of a function, using a tuned diagonal Jacobian approximation.
- linearmixing – Find a root of a function, using a scalar Jacobian approximation.
- diagbroyden – Find a root of a function, using diagonal Broyden Jacobian approximation.