scipy.optimize.curve_fit

scipy.optimize.curve_fit(f, xdata, ydata, p0=None, sigma=None, **kw)

Use non-linear least squares to fit a function, f, to data.

Assumes ydata = f(xdata, *params) + eps

Parameters :

f : callable

The model function, f(x, ...). It must take the independent variable as the first argument and the parameters to fit as separate remaining arguments.

xdata : An N-length sequence or a (k,N)-shaped array for functions with k predictors

The independent variable where the data is measured.

ydata : N-length sequence

The dependent data — nominally f(xdata, ...)

p0 : None, scalar, or M-length sequence

Initial guess for the parameters. If None, the initial values will all be 1, provided the number of parameters for the function can be determined using introspection; otherwise a ValueError is raised.

sigma : None or N-length sequence

If not None, it represents the standard deviation of ydata. This vector, if given, will be used as weights in the least-squares problem (a short sketch follows this list).
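
As a quick illustration of p0 and sigma, here is a minimal sketch; the arrays x_obs, y_obs, and y_err and the model function are hypothetical stand-ins for measured data, not part of this API:

>>> import numpy as np
>>> from scipy.optimize import curve_fit
>>> def model(x, a, b):
...     return a*x + b
>>> x_obs = np.array([0.0, 1.0, 2.0, 3.0])   # hypothetical measurements
>>> y_obs = np.array([0.1, 2.1, 3.9, 6.2])
>>> y_err = np.array([0.1, 0.1, 0.2, 0.2])   # per-point standard deviations of y_obs
>>> popt, pcov = curve_fit(model, x_obs, y_obs, p0=[1.0, 0.0], sigma=y_err)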

Returns :

popt : array

Optimal values for the parameters, chosen so that the sum of the squared residuals of f(xdata, *popt) - ydata is minimized.

pcov : 2d array

The estimated covariance of popt. The diagonal provides the variance of each parameter estimate (a short sketch follows this list).
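
For instance, approximate one-standard-deviation uncertainties on the fitted parameters can be read off the diagonal of pcov; a minimal sketch, assuming popt and pcov come from a curve_fit call such as the one in the Examples section:

>>> import numpy as np
>>> perr = np.sqrt(np.diag(pcov))   # standard errors of the fitted parameters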

Notes

The fit uses the Levenberg-Marquardt algorithm as implemented in scipy.optimize.leastsq. Additional keyword arguments are passed directly to that function.
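
For example, leastsq's maxfev keyword (the maximum number of calls to the model function) can be forwarded this way; a minimal sketch reusing func, x, and yn from the Examples section below:

>>> popt, pcov = curve_fit(func, x, yn, maxfev=2000)   # maxfev is passed through to leastsq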

Examples

>>> import numpy as np
>>> from scipy.optimize import curve_fit
>>> def func(x, a, b, c):
...     return a*np.exp(-b*x) + c
>>> x = np.linspace(0, 4, 50)
>>> y = func(x, 2.5, 1.3, 0.5)                    # noise-free data with known parameters
>>> yn = y + 0.2*np.random.normal(size=len(x))    # add Gaussian noise
>>> popt, pcov = curve_fit(func, x, yn)
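
As a quick check, the fitted parameters can be plugged back into the model; this short continuation of the example is only illustrative:

>>> y_fit = func(x, *popt)    # model evaluated at the fitted parameters
>>> residuals = yn - y_fit    # should resemble the injected noise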
