While copying the parameter named "toolbox_modules.backbone.kx", whose dimensions in the model are torch.Size([3, 3, 3, 3]) and whose dimensions in the checkpoint are torch.Size([3, 3, 3, 3])
Err… what?
Here is my module definition:
self.kx = nn.Parameter(torch.tensor([[1, 0, -1],
                                     [2, 0, -2],
                                     [1, 0, -1]], dtype=torch.float32).view(1, 1, 3, 3).expand(3, 3, 3, 3),
                       requires_grad=False)
self.ky = nn.Parameter(torch.tensor([[1, 2, 1],
                                     [0, 0, 0],
                                     [-1, -2, -1]], dtype=torch.float32).view(1, 1, 3, 3).expand(3, 3, 3, 3),
                       requires_grad=False)