
scipy.stats.entropy

scipy.stats.entropy(pk, qk=None, base=None)

Calculate the entropy of a distribution for given probability values.

If only probabilities pk are given, the entropy is calculated as S = -sum(pk * log(pk), axis=0).
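For example, a fair coin attains the maximum entropy for two outcomes, log(2) ≈ 0.693 nats:

>>> from scipy.stats import entropy
>>> entropy([0.5, 0.5])  # -(0.5*log(0.5) + 0.5*log(0.5)) = log(2)
0.6931471805599453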

If qk is not None, then compute the Kullback-Leibler divergence S = sum(pk * log(pk / qk), axis=0).
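For example, the divergence of a fair coin from a biased reference distribution (result rounded here for readability):

>>> from scipy.stats import entropy
>>> p = [0.5, 0.5]           # distribution pk
>>> q = [0.9, 0.1]           # reference distribution qk
>>> round(entropy(p, q), 4)  # 0.5*log(0.5/0.9) + 0.5*log(0.5/0.1)
0.5108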

This routine will normalize pk and qk if they don’t sum to 1.
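So raw event counts can be passed directly; for instance, four equally frequent outcomes give the same result as the explicit uniform distribution:

>>> from scipy.stats import entropy
>>> entropy([10, 10, 10, 10]) == entropy([0.25, 0.25, 0.25, 0.25])
True
>>> round(entropy([10, 10, 10, 10]), 4)  # log(4) nats
1.3863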

Parameters
pk : sequence

Defines the (discrete) distribution. pk[i] is the (possibly unnormalized) probability of event i.

qk : sequence, optional

Sequence against which the relative entropy is computed. Should be in the same format as pk.

base : float, optional

The logarithmic base to use; defaults to e (natural logarithm). See the example after the Returns section for computing entropy in bits.

Returns
S : float

The calculated entropy.
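For instance, with base=2 the fair-coin entropy from the first example is reported in bits rather than nats:

>>> from scipy.stats import entropy
>>> entropy([0.5, 0.5], base=2)  # log2(2) = 1 bit of uncertainty
1.0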
