# scipy.special.rel_entr

scipy.special.rel_entr(x, y, out=None) = <ufunc 'rel_entr'>

Elementwise function for computing relative entropy.

$$\mathrm{rel\_entr}(x, y) = \begin{cases} x \log(x / y) & x > 0, y > 0 \\ 0 & x = 0, y \ge 0 \\ \infty & \text{otherwise} \end{cases}$$
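The piecewise definition above can be expressed as a scalar reference sketch in pure Python. This is only illustrative: the actual `rel_entr` is a compiled ufunc that operates elementwise on arrays, and the helper name `rel_entr_scalar` here is invented for this example.

```python
import math

def rel_entr_scalar(x, y):
    """Scalar sketch of the piecewise rel_entr definition (illustrative only)."""
    if x > 0 and y > 0:
        return x * math.log(x / y)
    if x == 0 and y >= 0:
        return 0.0
    return math.inf

print(rel_entr_scalar(2.0, 1.0))   # 2*log(2)
print(rel_entr_scalar(0.0, 3.0))   # 0.0 by convention
print(rel_entr_scalar(-1.0, 1.0))  # inf: outside the domain
```

With SciPy itself, the equivalent call is `scipy.special.rel_entr(x, y)`, which also broadcasts over array inputs.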
Parameters

x, y : array_like
    Input arrays.

out : ndarray, optional
    Optional output array for the function results.

Returns

scalar or ndarray
    Relative entropy of the inputs.

Notes

New in version 0.15.0.

This function is jointly convex in x and y.

The origin of this function is in convex programming; see [1]. Given two discrete probability distributions $p_1, \ldots, p_n$ and $q_1, \ldots, q_n$, the relative entropy (Kullback-Leibler divergence) of the distributions is the sum

$\sum_{i = 1}^n \mathrm{rel\_entr}(p_i, q_i).$

See [2] for details.
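The sum above can be sketched directly. This is a hedged pure-Python illustration; the function name `kl_divergence` is invented here, and with SciPy one would instead write `scipy.special.rel_entr(p, q).sum()`.

```python
import math

def kl_divergence(p, q):
    """Sum of elementwise rel_entr terms over two discrete distributions.

    Illustrative sketch; equivalent to scipy.special.rel_entr(p, q).sum().
    """
    total = 0.0
    for pi, qi in zip(p, q):
        if pi > 0 and qi > 0:
            total += pi * math.log(pi / qi)
        elif pi == 0 and qi >= 0:
            continue  # rel_entr(0, y) = 0 for y >= 0
        else:
            return math.inf  # outside the domain
    return total

p = [0.5, 0.5]
q = [0.9, 0.1]
print(kl_divergence(p, q))
print(kl_divergence(p, p))  # 0.0 for identical distributions
```

Note that the result is zero exactly when the two distributions coincide, and it is not symmetric in `p` and `q`.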

References

[1] Grant, Boyd, and Ye, “CVX: Matlab Software for Disciplined Convex Programming”, http://cvxr.com/cvx/

[2] Kullback-Leibler divergence, https://en.wikipedia.org/wiki/Kullback%E2%80%93Leibler_divergence