Linear layer throws dimension error

Hi folks :wave:

I have the following linear layer:

self.decoder = nn.Linear(
    in_features=192,
    out_features=4,
    bias=False
)
self.bias = nn.Parameter(torch.zeros(4))
self.decoder.bias = self.bias

But when I run

self.decoder(torch.randn(64, 20, 192))

I get a dimension error:

Exception has occurred: RuntimeError
mat1 and mat2 shapes cannot be multiplied (1280x192 and 4x192)

Why does the linear layer treat in_features as 4? The weirdest part is that if I run the following in a Python terminal:

linear = nn.Linear(192, 4)
linear(torch.randn(64, 20, 192))

then it works. Those two layers are exactly the same, yet within my model it complains.

I cannot reproduce the issue given your code:

class MyModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.decoder = nn.Linear(
            in_features=192,
            out_features=4,
            bias=False
        )
        self.bias = nn.Parameter(torch.zeros(4))
        self.decoder.bias = self.bias
        
    def forward(self, x):
        x = self.decoder(x)
        return x
    
model = MyModel()
x = torch.randn(64, 20, 192)
out = model(x)
print(out.shape)
# > torch.Size([64, 20, 4])
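One hint from the error itself: `nn.Linear` computes `x @ weight.T`, and with a correctly shaped weight of `(out_features=4, in_features=192)` the second operand would be `192x4`. Your traceback reports mat2 as `4x192`, which suggests `decoder.weight` is being replaced elsewhere in your model with a transposed `(192, 4)` tensor (weight tying against an embedding of the wrong shape is a common way this happens). This is only a guess, but a minimal sketch that reproduces your exact error under that assumption:

```python
import torch
import torch.nn as nn

decoder = nn.Linear(in_features=192, out_features=4, bias=False)

# Hypothetical culprit: somewhere the weight gets reassigned to a
# transposed (192, 4) tensor, e.g. when tying it to an embedding
# whose shape doesn't match (out_features, in_features).
decoder.weight = nn.Parameter(torch.randn(192, 4))

x = torch.randn(64, 20, 192)
try:
    decoder(x)
except RuntimeError as e:
    print(e)
# mat1 and mat2 shapes cannot be multiplied (1280x192 and 4x192)
```

If that matches, check any code that assigns to `self.decoder.weight` (or ties it to another module's parameter) after construction, and make sure the tensor you assign has shape `(4, 192)`.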