How to load part of a model when the model contains more layers than the saved weights?


#1

If the two models have the same number of layers, I know the approach from "How to load part of pre trained model?" works.

However, if my new model has m+n layers and my old model has only m layers, PyTorch will complain about missing keys. How should I load the model then?


#2

The pre-trained checkpoint is loaded as an OrderedDict by calling torch.load(), so you can extract whatever weights you want from that dictionary. In your case, get your new model's state_dict, copy the pre-trained weights into the entries for the layers of interest, and then load the result with model.load_state_dict().
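A minimal sketch of that idea, using two hypothetical models (OldModel with m layers, NewModel with an extra layer) and an in-memory buffer in place of a checkpoint file — the filtering step keeps only keys that exist in the new model with matching shapes, so the extra layer keeps its fresh initialization:

```python
import io

import torch
import torch.nn as nn


class OldModel(nn.Module):
    """Stand-in for the pre-trained model with m layers."""

    def __init__(self):
        super().__init__()
        self.fc1 = nn.Linear(10, 20)
        self.fc2 = nn.Linear(20, 5)


class NewModel(nn.Module):
    """Stand-in for the new model: same m layers plus an extra one."""

    def __init__(self):
        super().__init__()
        self.fc1 = nn.Linear(10, 20)  # shared with OldModel
        self.fc2 = nn.Linear(20, 5)   # shared with OldModel
        self.fc3 = nn.Linear(5, 2)    # new layer, no saved weights


# Simulate a saved checkpoint (an in-memory buffer instead of a file).
old = OldModel()
buf = io.BytesIO()
torch.save(old.state_dict(), buf)
buf.seek(0)

new = NewModel()
pretrained_dict = torch.load(buf)   # OrderedDict of the old weights
model_dict = new.state_dict()       # OrderedDict of the new model

# Keep only entries whose names and shapes match the new model.
filtered = {
    k: v for k, v in pretrained_dict.items()
    if k in model_dict and v.shape == model_dict[k].shape
}
model_dict.update(filtered)
new.load_state_dict(model_dict)     # no missing-key complaint now
```

Alternatively, `new.load_state_dict(pretrained_dict, strict=False)` skips the missing keys directly, but the explicit filtering above also guards against shape mismatches when a shared layer was resized.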