scipy.special.log_softmax

scipy.special.log_softmax(x, axis=None)

Logarithm of softmax function:

log_softmax(x) = log(softmax(x))

Parameters

x : array_like

Input array.

axis : int or tuple of ints, optional

Axis to compute values along. Default is None, in which case log_softmax is computed over the entire array x.

Returns
s : ndarray or scalar

An array with the same shape as x. The exponential of the result sums to 1 along the specified axis. If x is a scalar, a scalar is returned.

Notes

log_softmax is more accurate than np.log(softmax(x)) with inputs that make softmax saturate (see examples below).

New in version 1.5.0.
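The improved accuracy follows from the identity log_softmax(x) = x - logsumexp(x), which avoids exponentiating and then taking the log. A minimal sketch of this approach (an illustration of the identity, not necessarily SciPy's exact implementation; `log_softmax_sketch` is a hypothetical helper name):

```python
import numpy as np
from scipy.special import logsumexp

def log_softmax_sketch(x, axis=None):
    # log(softmax(x)) = x - log(sum(exp(x))), computed stably via logsumexp,
    # so large inputs never saturate an intermediate softmax to 0.
    x = np.asarray(x)
    return x - logsumexp(x, axis=axis, keepdims=True)

y = log_softmax_sketch(np.array([1000.0, 1.0]))
# y is finite even though softmax([1000., 1.]) underflows to [1., 0.]
```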

Examples

>>> import numpy as np
>>> from scipy.special import log_softmax
>>> from scipy.special import softmax
>>> np.set_printoptions(precision=5)
>>> x = np.array([1000.0, 1.0])
>>> y = log_softmax(x)
>>> y
array([   0., -999.])
>>> with np.errstate(divide='ignore'):
...   y = np.log(softmax(x))
...
>>> y
array([  0., -inf])
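As a supplementary sketch (not part of the original examples), the axis parameter restricts the computation to one dimension of a 2-D array, and exponentiating the result gives rows that each sum to 1:

```python
import numpy as np
from scipy.special import log_softmax

x = np.array([[100.0, 0.0],
              [0.0, 100.0]])

# Compute log-softmax along each row independently.
y = log_softmax(x, axis=1)

# exp(y) recovers the softmax, so each row sums to 1.
row_sums = np.exp(y).sum(axis=1)
```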