scipy.optimize.linearmixing

scipy.optimize.linearmixing(F, xin, iter=None, alpha=None, verbose=False, maxiter=None, f_tol=None, f_rtol=None, x_tol=None, x_rtol=None, tol_norm=None, line_search='armijo', callback=None, **kw)

Find a root of a function, using a scalar Jacobian approximation.
Warning
This algorithm may be useful for specific problems, but whether it will work may depend strongly on the problem.
 Parameters
 F : function(x) -> f
Function whose root to find; should take and return an array-like object.
 xin : array_like
Initial guess for the solution.
 alpha : float, optional
The Jacobian approximation is (-1/alpha).
 iter : int, optional
Number of iterations to make. If omitted (default), make as many as required to meet tolerances.
 verbose : bool, optional
Print status to stdout on every iteration.
 maxiter : int, optional
Maximum number of iterations to make. If more are needed to meet convergence, NoConvergence is raised.
 f_tol : float, optional
Absolute tolerance (in max-norm) for the residual. If omitted, default is 6e-6.
 f_rtol : float, optional
Relative tolerance for the residual. If omitted, not used.
 x_tol : float, optional
Absolute minimum step size, as determined from the Jacobian approximation. If the step size is smaller than this, optimization is terminated as successful. If omitted, not used.
 x_rtol : float, optional
Relative minimum step size. If omitted, not used.
 tol_norm : function(vector) -> scalar, optional
Norm to use in convergence check. Default is the maximum norm.
 line_search : {None, 'armijo' (default), 'wolfe'}, optional
Which type of line search to use to determine the step size in the direction given by the Jacobian approximation. Defaults to 'armijo'.
 callback : function, optional
Optional callback function. It is called on every iteration as
callback(x, f)
where x is the current solution and f the corresponding residual.
 Returns
 sol : ndarray
An array (of similar array type as xin) containing the final solution.
 Raises
 NoConvergence
When a solution was not found.
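A minimal sketch of calling this routine on a scalar fixed-point problem. The residual, starting guess, alpha, and tolerance below are illustrative choices, not values taken from the documentation above:

```python
import numpy as np
from scipy.optimize import linearmixing

def F(x):
    # Residual whose root is the fixed point of cos: x = cos(x)
    return np.cos(x) - x

# alpha sets the scalar Jacobian approximation (-1/alpha); here the
# resulting iteration contracts toward the root near x ~ 0.7391.
sol = linearmixing(F, [0.5], alpha=0.5, f_tol=1e-10)
print(sol)
```

Because the Jacobian is approximated by a single scalar, convergence hinges on alpha being a reasonable match for the problem; a poor choice can stall or diverge, which is what the warning above cautions about.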