
LeakyReLU

class torch.ao.nn.quantized.LeakyReLU(scale, zero_point, negative_slope=0.01, inplace=False, device=None, dtype=None)

This is the quantized equivalent of torch.nn.LeakyReLU.

Parameters
  • scale (float) – quantization scale of the output tensor

  • zero_point (int) – quantization zero point of the output tensor

  • negative_slope (float) – Controls the angle of the negative slope. Default: 1e-2
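
A minimal usage sketch (the scale, zero_point, and input values below are illustrative assumptions, not recommended settings): quantize a float tensor with torch.quantize_per_tensor, then apply the module. The scale and zero_point passed to the constructor determine the quantization parameters of the output tensor.

Example:

    >>> import torch
    >>> from torch.ao.nn.quantized import LeakyReLU
    >>> # quantize a float input tensor to quint8
    >>> x_q = torch.quantize_per_tensor(torch.randn(2, 3), scale=0.05, zero_point=128, dtype=torch.quint8)
    >>> # scale/zero_point here set the quantization parameters of the output tensor
    >>> m = LeakyReLU(scale=0.05, zero_point=128, negative_slope=0.01)
    >>> y_q = m(x_q)
    >>> y_q.dequantize()  # dequantize to inspect the float values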