This is documentation for an old release of SciPy (version 0.9.0).
Calculate the standard deviation of the values of an n-D image array, optionally at specified sub-regions.
Parameters :
    input : array_like
        N-D image data to process.
    labels : array_like or None, optional
        Labels identifying sub-regions in `input`. If not None, it must have the same shape as `input`.
    index : None, int, or sequence of ints, optional
        `labels` values to include in the output. If None (default), all values where `labels` is non-zero are used.
Returns :
    std : float or ndarray
        Values of standard deviation, for each sub-region if `labels` and `index` are specified.
Examples
>>> import numpy as np
>>> a = np.array([[1, 2, 0, 0],
...               [5, 3, 0, 4],
...               [0, 0, 0, 7],
...               [9, 3, 0, 0]])
>>> from scipy import ndimage
>>> ndimage.standard_deviation(a)
2.7585095613392387
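With neither `labels` nor `index`, every element of the array contributes, so the result matches NumPy's population standard deviation of the flattened array. A quick cross-check (a standalone sketch of the doctest above):

```python
import numpy as np
from scipy import ndimage

a = np.array([[1, 2, 0, 0],
              [5, 3, 0, 4],
              [0, 0, 0, 7],
              [9, 3, 0, 0]])

# With no labels, standard_deviation reduces over the whole array,
# matching np.std in its population form (ddof=0).
full = ndimage.standard_deviation(a)
print(full, np.std(a))
```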
Features to process can be specified using labels and index:
>>> lbl, nlbl = ndimage.label(a)
>>> ndimage.standard_deviation(a, lbl, index=np.arange(1, nlbl+1))
array([ 1.479, 1.5 , 3. ])
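The per-label results can be reproduced with boolean masks: each entry is `np.std` taken over the elements carrying that label (the `per_label` name below is illustrative):

```python
import numpy as np
from scipy import ndimage

a = np.array([[1, 2, 0, 0],
              [5, 3, 0, 4],
              [0, 0, 0, 7],
              [9, 3, 0, 0]])
lbl, nlbl = ndimage.label(a)

# For each label value, mask out that sub-region and take its std;
# this mirrors standard_deviation(a, lbl, index=1..nlbl).
per_label = [np.std(a[lbl == i]) for i in range(1, nlbl + 1)]
```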
If no index is given, non-zero labels are processed:
>>> ndimage.standard_deviation(a, lbl)
2.4874685927665499
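Equivalently, with `labels` given but no `index`, all elements under any non-zero label are pooled into a single region, so the result equals `np.std` over the masked values; a minimal sketch:

```python
import numpy as np
from scipy import ndimage

a = np.array([[1, 2, 0, 0],
              [5, 3, 0, 4],
              [0, 0, 0, 7],
              [9, 3, 0, 0]])
lbl, nlbl = ndimage.label(a)

# Every element with a non-zero label forms one pooled region,
# so this matches standard_deviation(a, lbl) with index=None.
pooled = np.std(a[lbl != 0])
print(pooled)
```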