I trained a model without any dropout, and now I want to add some dropout layers and resume the training using the last checkpoint I have. However, when I try to do so, I get these errors:
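To illustrate the kind of change I'm making, here is a minimal, made-up example (not my actual network, and the layer sizes are arbitrary): inserting a Dropout layer into an nn.Sequential shifts the numeric indices of every layer after it, so the parameter names in the new model no longer line up with the names saved in the checkpoint.

```python
import torch.nn as nn

# Original model: BatchNorm is at index 1, so its parameters
# are saved under keys like "1.weight", "1.running_mean", ...
old = nn.Sequential(nn.Conv2d(3, 8, 3), nn.BatchNorm2d(8))

# After inserting Dropout2d, the BatchNorm shifts to index 2,
# so the same parameters are now expected under "2.weight", ...
new = nn.Sequential(nn.Conv2d(3, 8, 3), nn.Dropout2d(0.1), nn.BatchNorm2d(8))

print(sorted(old.state_dict().keys()))
print(sorted(new.state_dict().keys()))
# load_state_dict then reports the old "1.*" keys as unexpected
# and the new "2.*" keys as missing.
```

This looks like exactly the missing/unexpected key pattern in my error below.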
FLOPs: 387.73M, Params: 1.15M
=> loading checkpoint './snapshots/imagenet/simpnets/1mil/nodrp/chkpt_simpnet_imgnet_1m_nodrp_s1_2018-07-08_17-00-55.pth.tar'
Traceback (most recent call last):
  File "imagenet_train.py", line 537, in <module>
    main()
  File "imagenet_train.py", line 122, in main
    model.load_state_dict(checkpoint['state_dict'])
  File "/home/shishosama/anaconda3/lib/python3.6/site-packages/torch/nn/modules/module.py", line 721, in load_state_dict
    self.__class__.__name__, "\n\t".join(error_msgs)))
RuntimeError: Error(s) in loading state_dict for DataParallel:
Missing key(s) in state_dict: "module.features.5.weight", "module.features.5.bias", "module.features.5.running_mean", "module.features.5.running_var", "module.features.8.weight", "module.features.8.bias", "module.features.9.running_mean", "module.features.9.running_var", "module.features.21.weight", "module.features.21.bias", "module.features.22.running_mean", "module.features.22.running_var", "module.features.30.weight", "module.features.30.bias", "module.features.30.running_mean", "module.features.30.running_var", "module.features.34.weight", "module.features.34.bias", "module.features.34.running_mean", "module.features.34.running_var", "module.features.37.weight", "module.features.37.bias", "module.features.38.running_mean", "module.features.38.running_var", "module.features.42.weight", "module.features.42.bias", "module.features.43.weight", "module.features.43.bias", "module.features.43.running_mean", "module.features.43.running_var", "module.features.46.weight", "module.features.46.bias", "module.features.47.weight", "module.features.47.bias", "module.features.47.running_mean", "module.features.47.running_var", "module.features.50.weight", "module.features.50.bias", "module.features.51.weight", "module.features.51.bias", "module.features.51.running_mean", "module.features.51.running_var".
Unexpected key(s) in state_dict: "module.features.3.weight", "module.features.3.bias", "module.features.4.running_mean", "module.features.4.running_var", "module.features.6.weight", "module.features.6.bias", "module.features.7.weight", "module.features.7.bias", "module.features.7.running_mean", "module.features.7.running_var", "module.features.10.weight", "module.features.10.bias", "module.features.10.running_mean", "module.features.10.running_var", "module.features.19.weight", "module.features.19.bias", "module.features.20.weight", "module.features.20.bias", "module.features.20.running_mean", "module.features.20.running_var", "module.features.23.weight", "module.features.23.bias", "module.features.23.running_mean", "module.features.23.running_var", "module.features.28.weight", "module.features.28.bias", "module.features.29.running_mean", "module.features.29.running_var", "module.features.32.weight", "module.features.32.bias", "module.features.33.running_mean", "module.features.33.running_var", "module.features.35.weight", "module.features.35.bias", "module.features.36.weight", "module.features.36.bias", "module.features.36.running_mean", "module.features.36.running_var", "module.features.39.weight", "module.features.39.bias", "module.features.39.running_mean", "module.features.39.running_var".
While copying the parameter named "module.features.4.weight", whose dimensions in the model are torch.Size([80, 60, 3, 3]) and whose dimensions in the checkpoint are torch.Size([80]).
While copying the parameter named "module.features.9.weight", whose dimensions in the model are torch.Size([80]) and whose dimensions in the checkpoint are torch.Size([80, 80, 3, 3]).
While copying the parameter named "module.features.12.weight", whose dimensions in the model are torch.Size([80, 80, 3, 3]) and whose dimensions in the checkpoint are torch.Size([85, 80, 3, 3]).
While copying the parameter named "module.features.12.bias", whose dimensions in the model are torch.Size([80]) and whose dimensions in the checkpoint are torch.Size([85]).
While copying the parameter named "module.features.13.weight", whose dimensions in the model are torch.Size([80]) and whose dimensions in the checkpoint are torch.Size([85]).
While copying the parameter named "module.features.13.bias", whose dimensions in the model are torch.Size([80]) and whose dimensions in the checkpoint are torch.Size([85]).
While copying the parameter named "module.features.13.running_mean", whose dimensions in the model are torch.Size([80]) and whose dimensions in the checkpoint are torch.Size([85]).
While copying the parameter named "module.features.13.running_var", whose dimensions in the model are torch.Size([80]) and whose dimensions in the checkpoint are torch.Size([85]).
While copying the parameter named "module.features.16.weight", whose dimensions in the model are torch.Size([85, 80, 3, 3]) and whose dimensions in the checkpoint are torch.Size([85, 85, 3, 3]).
While copying the parameter named "module.features.22.weight", whose dimensions in the model are torch.Size([85]) and whose dimensions in the checkpoint are torch.Size([90, 90, 3, 3]).
While copying the parameter named "module.features.22.bias", whose dimensions in the model are torch.Size([85]) and whose dimensions in the checkpoint are torch.Size([90]).
While copying the parameter named "module.features.25.weight", whose dimensions in the model are torch.Size([90, 85, 3, 3]) and whose dimensions in the checkpoint are torch.Size([90, 90, 3, 3]).
While copying the parameter named "module.features.29.weight", whose dimensions in the model are torch.Size([90, 90, 3, 3]) and whose dimensions in the checkpoint are torch.Size([110]).
While copying the parameter named "module.features.29.bias", whose dimensions in the model are torch.Size([90]) and whose dimensions in the checkpoint are torch.Size([110]).
While copying the parameter named "module.features.33.weight", whose dimensions in the model are torch.Size([90, 90, 3, 3]) and whose dimensions in the checkpoint are torch.Size([110]).
While copying the parameter named "module.features.33.bias", whose dimensions in the model are torch.Size([90]) and whose dimensions in the checkpoint are torch.Size([110]).
While copying the parameter named "module.features.38.weight", whose dimensions in the model are torch.Size([110]) and whose dimensions in the checkpoint are torch.Size([150, 127, 3, 3]).
While copying the parameter named "module.features.38.bias", whose dimensions in the model are torch.Size([110]) and whose dimensions in the checkpoint are torch.Size([150]).
What should I do?
Thanks in advance.