
torch.nn.functional.softmax

torch.nn.functional.softmax(input, dim=None, _stacklevel=3, dtype=None)[source]

Apply a softmax function.

Softmax is defined as:

\text{Softmax}(x_{i}) = \frac{\exp(x_i)}{\sum_j \exp(x_j)}

It is applied to all slices along dim, and will re-scale them so that the elements lie in the range [0, 1] and sum to 1.

See Softmax for more details.
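
For example, for the input (1, 2, 3) the denominator is exp(1) + exp(2) + exp(3) ≈ 30.19, giving probabilities ≈ (0.0900, 0.2447, 0.6652). A minimal doctest-style sketch, evaluating the definition directly and comparing it with the library call:

    >>> import torch
    >>> import torch.nn.functional as F
    >>> x = torch.tensor([1.0, 2.0, 3.0])
    >>> torch.exp(x) / torch.exp(x).sum()  # the definition, evaluated directly
    tensor([0.0900, 0.2447, 0.6652])
    >>> F.softmax(x, dim=0)  # same result via the library call
    tensor([0.0900, 0.2447, 0.6652])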

Parameters
  • input (Tensor) – the input tensor.

  • dim (int) – A dimension along which softmax will be computed.

  • dtype (torch.dtype, optional) – the desired data type of the returned tensor. If specified, the input tensor is cast to dtype before the operation is performed. This is useful for preventing data type overflows. Default: None.

Return type

Tensor
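
A short usage sketch showing that each slice along dim is rescaled to sum to 1, and the effect of the optional dtype cast:

    >>> import torch
    >>> import torch.nn.functional as F
    >>> x = torch.randn(2, 3)
    >>> probs = F.softmax(x, dim=1)  # softmax over each row
    >>> probs.sum(dim=1)             # every row now sums to 1
    tensor([1.0000, 1.0000])
    >>> F.softmax(x, dim=1, dtype=torch.float64).dtype  # input is cast before the op
    torch.float64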

Note

This function doesn't work directly with NLLLoss, which expects log-probabilities as input rather than softmax outputs. Use log_softmax instead (it's faster and has better numerical properties).
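
A minimal sketch of the recommended pattern with NLLLoss; the tensor shapes and names below are illustrative:

    >>> import torch
    >>> import torch.nn.functional as F
    >>> logits = torch.randn(4, 5)                # illustrative: batch of 4, 5 classes
    >>> target = torch.tensor([0, 2, 4, 1])       # illustrative class indices
    >>> log_probs = F.log_softmax(logits, dim=1)  # numerically stable log-probabilities
    >>> loss = F.nll_loss(log_probs, target)      # NLLLoss consumes log-probabilities
    >>> torch.allclose(log_probs, torch.log(F.softmax(logits, dim=1)), atol=1e-6)
    True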