Size mismatch while loading state dictionary


While loading a pre-trained model, I am getting the same runtime error for every layer, e.g.:
size mismatch for conv_batchnorm.weight: copying a param with shape torch.Size([1, 32]) from checkpoint, the shape in current model is torch.Size([32])

Kindly help me figure out how to solve this problem.


Is conv_batchnorm an nn.BatchNorm layer?
Was the state_dict saved with a different PyTorch version than the one you are using to reload it?
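If the checkpoint tensors simply carry a spurious leading dimension of 1 (torch.Size([1, 32]) vs. torch.Size([32])), one possible workaround is to squeeze that dimension out of every parameter before calling load_state_dict. A minimal sketch, assuming the model really does expect the squeezed shapes (the module name conv_batchnorm and the simulated checkpoint here are illustrative, not from your code):

```python
import torch
import torch.nn as nn

# Hypothetical minimal model whose batch-norm weight has shape [32].
model = nn.Sequential()
model.add_module("conv_batchnorm", nn.BatchNorm1d(32))

# Simulate a checkpoint whose tensors carry an extra leading dimension,
# e.g. torch.Size([1, 32]) instead of torch.Size([32]).
checkpoint = {k: v.unsqueeze(0) for k, v in model.state_dict().items()}

# Squeeze the spurious leading dimension wherever it is present, so the
# shapes match the current model's state_dict.
fixed = {
    k: v.squeeze(0) if v.dim() > 0 and v.size(0) == 1 else v
    for k, v in checkpoint.items()
}

model.load_state_dict(fixed)  # loads without a size-mismatch error
```

In your case you would replace the simulated checkpoint with torch.load(...) on your checkpoint file. If the shapes differ in a less trivial way, though, the checkpoint architecture probably doesn't match the current model, and squeezing would be the wrong fix.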