scipy.special.entr
scipy.special.entr(x, out=None) = <ufunc 'entr'>
Elementwise function for computing entropy.
\[\begin{split}\text{entr}(x) = \begin{cases} - x \log(x) & x > 0 \\ 0 & x = 0 \\ -\infty & \text{otherwise} \end{cases}\end{split}\]
- Parameters:
- x : ndarray
Input array.
- out : ndarray, optional
Optional output array for the function values.
- Returns:
- res : scalar or ndarray
The value of the elementwise entropy function at the given points x.
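As a quick sketch of the piecewise definition above (the sample values are chosen for illustration):

```python
import numpy as np
from scipy.special import entr

# entr is applied elementwise: -x*log(x) for x > 0,
# 0 at x = 0, and -inf for any negative input
x = np.array([2.0, 1.0, 0.5, 0.0, -1.0])
print(entr(x))
```

Note that entr(1) = 0 because log(1) = 0, so both x = 0 and x = 1 map to zero.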
Notes
Added in version 0.15.0.
This function is concave.
The origin of this function is in convex programming; see [1]. Given a probability distribution \(p_1, \ldots, p_n\), the definition of entropy in the context of information theory is
\[\sum_{i = 1}^n \mathrm{entr}(p_i).\]
To compute the latter quantity, use scipy.stats.entropy.
References
[1] Boyd, Stephen and Lieven Vandenberghe. Convex Optimization. Cambridge University Press, 2004. DOI: https://doi.org/10.1017/CBO9780511804441
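To illustrate the connection to information-theoretic entropy described in the Notes (the distribution p is chosen for illustration), summing entr over a normalized probability vector matches scipy.stats.entropy:

```python
import numpy as np
from scipy.special import entr
from scipy.stats import entropy

p = np.array([0.5, 0.25, 0.25])  # a normalized probability distribution

# Summing entr over the distribution gives the Shannon entropy in nats
print(entr(p).sum())
print(entropy(p))  # computes the same quantity
```

The two agree here because p sums to 1; scipy.stats.entropy normalizes its input first, so for an unnormalized vector the results would differ.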