
ReLU#

class torch.nn.ReLU(inplace=False)[source]#

Applies the rectified linear unit function element-wise.

\text{ReLU}(x) = (x)^+ = \max(0, x)
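As a quick check of the definition, the functional form torch.relu zeroes out negative entries and leaves non-negative ones unchanged (a minimal sketch; torch.nn.functional.relu behaves the same way):

  >>> import torch
  >>> torch.relu(torch.tensor([-1.0, 0.0, 2.0]))
  tensor([0., 0., 2.])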

Parameters

inplace (bool) – can optionally do the operation in-place. Default: False
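A minimal sketch of what inplace=True means in practice: the result is written back into the input tensor, which avoids an extra allocation but overwrites the original values:

  >>> import torch
  >>> import torch.nn as nn
  >>> x = torch.tensor([-1.0, 2.0])
  >>> nn.ReLU(inplace=True)(x)
  tensor([0., 2.])
  >>> x  # the input itself has been modified
  tensor([0., 2.])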

Shape:
  • Input: (*), where * means any number of dimensions.

  • Output: (*), same shape as the input.
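To illustrate the shape contract, applying the module to a tensor of any shape returns a tensor of the same shape (a small sketch):

  >>> import torch
  >>> import torch.nn as nn
  >>> nn.ReLU()(torch.randn(4, 3, 2)).shape
  torch.Size([4, 3, 2])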

Figure: plot of the ReLU activation function.

Examples:

  >>> import torch
  >>> import torch.nn as nn
  >>> m = nn.ReLU()
  >>> input = torch.randn(2)
  >>> output = m(input)


An implementation of CReLU (Concatenated ReLU) - https://arxiv.org/abs/1603.05201

  >>> m = nn.ReLU()
  >>> input = torch.randn(2).unsqueeze(0)
  >>> output = torch.cat((m(input), m(-input)))
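For batched inputs, CReLU is usually applied along the channel dimension, so the feature dimension doubles while the batch dimension is preserved (a sketch, assuming an input of shape (N, C) and the module m from the example above):

  >>> x = torch.randn(8, 16)
  >>> torch.cat((m(x), m(-x)), dim=1).shape
  torch.Size([8, 32])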