
Softmax#

class torch.nn.modules.activation.Softmax(dim=None)[source]#

Applies the Softmax function to an n-dimensional input Tensor.

Softmax rescales the input so that the elements of the n-dimensional output Tensor lie in the range [0, 1] and sum to 1.

Softmax is defined as:

\text{Softmax}(x_{i}) = \frac{\exp(x_i)}{\sum_j \exp(x_j)}

When the input Tensor is a sparse tensor, the unspecified values are treated as -inf.
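The definition above can be checked numerically against the built-in implementation (the input values here are illustrative):

```python
import torch

x = torch.tensor([1.0, 2.0, 3.0])

# Compute softmax directly from the definition:
# exp(x_i) divided by the sum of exp(x_j) over all j
manual = torch.exp(x) / torch.exp(x).sum()

# Compare with the library implementation
builtin = torch.softmax(x, dim=0)

print(torch.allclose(manual, builtin))  # True
```

Both results lie in [0, 1] and sum to 1 along the chosen dimension.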

Shape:
  • Input: (*), where * means any number of additional dimensions

  • Output: (*), same shape as the input

Returns

a Tensor of the same dimension and shape as the input with values in the range [0, 1]

Parameters

dim (int) โ€“ A dimension along which Softmax will be computed (so every slice along dim will sum to 1).

Return type

None

Note

This module doesn't work directly with NLLLoss, which expects the log to be computed between the Softmax and itself. Use LogSoftmax instead (it's faster and has better numerical properties).
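To illustrate the note above, a minimal sketch (shapes and values are illustrative): feeding LogSoftmax output into NLLLoss matches CrossEntropyLoss, which fuses the two steps.

```python
import torch
from torch import nn

logits = torch.randn(4, 5)           # batch of 4, 5 classes
targets = torch.tensor([0, 1, 2, 3]) # one class index per sample

# LogSoftmax output feeds NLLLoss directly
log_probs = nn.LogSoftmax(dim=1)(logits)
loss = nn.NLLLoss()(log_probs, targets)

# CrossEntropyLoss combines LogSoftmax and NLLLoss in one step
ce = nn.CrossEntropyLoss()(logits, targets)

print(torch.allclose(loss, ce))  # True
```

Using LogSoftmax (or CrossEntropyLoss) avoids taking the log of a Softmax output, which can underflow for very small probabilities.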

Examples:

>>> import torch
>>> from torch import nn
>>> m = nn.Softmax(dim=1)
>>> input = torch.randn(2, 3)
>>> output = m(input)  # each row of output sums to 1
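The choice of dim determines which slices are normalized; a quick check with illustrative values:

```python
import torch
from torch import nn

t = torch.randn(2, 3)

# dim=1: normalize across columns, so each row sums to 1
rows = nn.Softmax(dim=1)(t)
# dim=0: normalize across rows, so each column sums to 1
cols = nn.Softmax(dim=0)(t)

print(rows.sum(dim=1))  # each entry is (approximately) 1
print(cols.sum(dim=0))  # each entry is (approximately) 1
```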
extra_repr()[source]#

Return the extra representation of the module.

Return type

str

forward(input)[source]#

Runs the forward pass.

Return type

Tensor