
scipy.optimize.check_grad

scipy.optimize.check_grad(func, grad, x0, *args, **kwargs)

Check the correctness of a gradient function by comparing it against a (forward) finite-difference approximation of the gradient.

Parameters
func : callable func(x0, *args)

Function whose derivative is to be checked.

grad : callable grad(x0, *args)

Gradient of func.

x0 : ndarray

Point(s) at which to check grad against the forward-difference approximation of the gradient computed from func.

args : *args, optional

Extra arguments passed to func and grad.

epsilon : float, optional

Step size used for the finite difference approximation. It defaults to sqrt(numpy.finfo(float).eps), which is approximately 1.49e-08.
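If the default step is unsuitable for a particular problem, a different value can be supplied through the epsilon keyword. A minimal sketch of this usage (f, df, and the 1e-6 step below are illustrative choices, not recommendations):

>>> from scipy.optimize import check_grad
>>> def f(x):
...     return x[0]**3
>>> def df(x):
...     return [3 * x[0]**2]
>>> err = check_grad(f, df, [2.0], epsilon=1e-6)  # forward differences with step 1e-6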

Returns
err : float

The square root of the sum of squares (i.e., the 2-norm) of the difference between grad(x0, *args) and the finite-difference approximation of the gradient of func at the point(s) x0.
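A minimal sketch of how this value relates to approx_fprime, assuming the default forward-difference step (func and grad are the same illustrative functions as in the Examples below; the exact numerical result depends on platform rounding, so it is not shown):

>>> import numpy as np
>>> from scipy.optimize import approx_fprime
>>> def func(x):
...     return x[0]**2 - 0.5 * x[1]**3
>>> def grad(x):
...     return [2 * x[0], -1.5 * x[1]**2]
>>> x0 = np.asarray([1.5, -1.5])
>>> eps = np.sqrt(np.finfo(float).eps)               # the default step size
>>> fd = approx_fprime(x0, func, eps)                # forward-difference gradient of func
>>> err = np.linalg.norm(np.asarray(grad(x0)) - fd)  # the quantity check_grad returns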

See also

approx_fprime

Examples

>>> def func(x):
...     return x[0]**2 - 0.5 * x[1]**3
>>> def grad(x):
...     return [2 * x[0], -1.5 * x[1]**2]
>>> from scipy.optimize import check_grad
>>> check_grad(func, grad, [1.5, -1.5])
2.9802322387695312e-08
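
A value this close to finite-difference accuracy indicates agreement between grad and func. Continuing the same session with a deliberately wrong gradient (bad_grad is a hypothetical name introduced here for illustration) shows how a mismatch is reflected in a much larger return value:

>>> def bad_grad(x):
...     return [2 * x[0], 1.5 * x[1]**2]   # sign error in the second component
>>> bool(check_grad(func, bad_grad, [1.5, -1.5]) > 1e-3)
True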
