Bilinear#

class torch.nn.modules.linear.Bilinear(in1_features, in2_features, out_features, bias=True, device=None, dtype=None)[source]#

Applies a bilinear transformation to the incoming data: y = x_1^T A x_2 + b.
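As a sanity check, the transformation can be spelled out with torch.einsum. This is an illustrative sketch of the formula above, not how the layer is implemented internally:

>>> import torch
>>> from torch import nn
>>> m = nn.Bilinear(3, 4, 5)
>>> x1 = torch.randn(2, 3)
>>> x2 = torch.randn(2, 4)
>>> # weight has shape (out_features, in1_features, in2_features)
>>> manual = torch.einsum("bi,oij,bj->bo", x1, m.weight, x2) + m.bias
>>> torch.allclose(m(x1, x2), manual, atol=1e-6)
True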

Parameters
  • in1_features (int) – size of each first input sample, must be > 0

  • in2_features (int) – size of each second input sample, must be > 0

  • out_features (int) – size of each output sample, must be > 0

  • bias (bool) – If set to False, the layer will not learn an additive bias. Default: True

Shape:
  • Input1: (*, H_{\text{in1}}) where H_{\text{in1}} = \text{in1\_features} and * means any number of additional dimensions, including none. All but the last dimension of the inputs should be the same.

  • Input2: (*, H_{\text{in2}}) where H_{\text{in2}} = \text{in2\_features}.

  • Output: (*, H_{\text{out}}) where H_{\text{out}} = \text{out\_features} and all but the last dimension are the same shape as the input.
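A short sketch of this shape contract with extra leading dimensions (the sizes here are illustrative):

>>> m = nn.Bilinear(6, 7, 8)
>>> x1 = torch.randn(2, 3, 6)  # (*, H_in1) with * = (2, 3)
>>> x2 = torch.randn(2, 3, 7)  # same leading dimensions, H_in2 = 7
>>> m(x1, x2).shape
torch.Size([2, 3, 8])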

Variables
  • weight (torch.Tensor) – the learnable weights of the module, of shape (\text{out\_features}, \text{in1\_features}, \text{in2\_features}). The values are initialized from \mathcal{U}(-\sqrt{k}, \sqrt{k}), where k = \frac{1}{\text{in1\_features}}.

  • bias – the learnable bias of the module, of shape (\text{out\_features}). If bias is True, the values are initialized from \mathcal{U}(-\sqrt{k}, \sqrt{k}), where k = \frac{1}{\text{in1\_features}}.
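A quick sketch of the resulting parameter shapes and the initialization bound (note that k depends only on in1_features, for both weight and bias):

>>> m = nn.Bilinear(20, 30, 40)
>>> m.weight.shape, m.bias.shape
(torch.Size([40, 20, 30]), torch.Size([40]))
>>> bound = (1 / 20) ** 0.5  # sqrt(k) with k = 1 / in1_features
>>> bool(m.weight.abs().max() <= bound) and bool(m.bias.abs().max() <= bound)
True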

Examples:

>>> import torch
>>> from torch import nn
>>> m = nn.Bilinear(20, 30, 40)
>>> input1 = torch.randn(128, 20)
>>> input2 = torch.randn(128, 30)
>>> output = m(input1, input2)
>>> print(output.size())
torch.Size([128, 40])
extra_repr()[source]#

Return the extra representation of the module.

Return type

str
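For example, this string appears in the module's printed representation:

>>> m = nn.Bilinear(20, 30, 40)
>>> print(m)
Bilinear(in1_features=20, in2_features=30, out_features=40, bias=True)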

forward(input1, input2)[source]#

Runs the forward pass.

Return type

Tensor

reset_parameters()[source]#

Reinitializes the parameters using the same scheme as in __init__.
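A small sketch: calling reset_parameters() redraws the parameters from the same uniform distribution, so a previously saved copy will (almost surely) no longer match:

>>> m = nn.Bilinear(20, 30, 40)
>>> old_weight = m.weight.clone()
>>> m.reset_parameters()
>>> torch.equal(old_weight, m.weight)
False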