scipy.special.kl_div
scipy.special.kl_div(x, y, out=None) = <ufunc 'kl_div'>
Elementwise function for computing Kullback-Leibler divergence.
\[\mathrm{kl\_div}(x, y) = \begin{cases} x \log(x / y) - x + y & x > 0, y > 0 \\ y & x = 0, y \ge 0 \\ \infty & \text{otherwise} \end{cases}\]
Parameters:
- x, y : array_like
  Real arguments
- out : ndarray, optional
  Optional output array for the function results
Returns:
- scalar or ndarray
  Values of the Kullback-Leibler divergence.
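For example, each branch of the piecewise definition can be exercised directly:

>>> from scipy.special import kl_div
>>> float(kl_div(2.0, 1.0))   # x > 0, y > 0: x*log(x/y) - x + y
0.3862943611198906
>>> float(kl_div(0.0, 3.0))   # x = 0, y >= 0: result is y
3.0
>>> float(kl_div(-1.0, 1.0))  # otherwise: infinity
inf
>>> kl_div([1.0, 2.0], [2.0, 2.0])   # broadcasts elementwise over arrays
array([0.30685282, 0.        ])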
See also
- rel_entr
  The same expression without the extra \(-x + y\) terms (see Notes).
Notes
Added in version 0.15.0.
This function is non-negative and is jointly convex in x and y.
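Non-negativity can be spot-checked numerically; a quick sketch on random positive inputs:

>>> import numpy as np
>>> from scipy.special import kl_div
>>> rng = np.random.default_rng(1234)
>>> x, y = rng.random(1000), rng.random(1000)
>>> bool(np.all(kl_div(x, y) >= 0))   # holds for all branches of the definition
True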
The origin of this function is in convex programming; see [1] for details. This is why the function contains the extra \(-x + y\) terms over what might be expected from the Kullback-Leibler divergence. For a version of the function without the extra terms, see rel_entr.
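Numerically, the relationship is kl_div(x, y) == rel_entr(x, y) - x + y elementwise wherever both are finite; a quick check:

>>> import numpy as np
>>> from scipy.special import kl_div, rel_entr
>>> x = np.array([0.5, 1.5, 2.0])
>>> y = np.array([1.0, 1.0, 4.0])
>>> np.allclose(kl_div(x, y), rel_entr(x, y) - x + y)
True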
References
[1] Boyd, Stephen and Lieven Vandenberghe. Convex optimization. Cambridge University Press, 2004. https://doi.org/10.1017/CBO9780511804441