
leaky_relu#

class torch.ao.nn.quantized.functional.leaky_relu(input, negative_slope=0.01, inplace=False, scale=None, zero_point=None)[source]#

Quantized version of leaky_relu(input, negative_slope=0.01, inplace=False, scale, zero_point) -> Tensor

Applies element-wise, $\text{LeakyReLU}(x) = \max(0, x) + \text{negative\_slope} * \min(0, x)$

Parameters
  • input (Tensor) โ€“ Quantized input

  • negative_slope (float) โ€“ The slope of the negative input

  • inplace (bool) โ€“ Inplace modification of the input tensor

  • scale (Optional[float]) – Scale of the output tensor.

  • zero_point (Optional[int]) – Zero point of the output tensor.

See LeakyReLU for more details.
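A minimal usage sketch: the input must be a quantized tensor, so the float input is first quantized with `torch.quantize_per_tensor`. The scale, zero point, and dtype values below are illustrative choices, not requirements.

```python
import torch
import torch.ao.nn.quantized.functional as qF

# Quantize a float tensor; zero_point=128 lets quint8 represent negatives
x = torch.tensor([-1.0, -0.5, 0.0, 0.5, 1.0])
qx = torch.quantize_per_tensor(x, scale=0.05, zero_point=128, dtype=torch.quint8)

# Apply quantized leaky_relu, specifying the output quantization parameters
out = qF.leaky_relu(qx, negative_slope=0.01, scale=0.05, zero_point=128)

# The result is still a quantized tensor; dequantize to inspect float values
print(out.dequantize())
```

If `scale` and `zero_point` are not given, the output reuses the input's quantization parameters.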