Load_state_dict with same layers but different containers

Hi everyone,

I have a file with the pretrained weights for a model.
I also have a slightly modified version of the model, consisting of the same layers in the same order, but with those layers grouped into different nn.Sequential containers.
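Here is a minimal illustration of what I mean (toy layers, not my real model): the same modules in the same order produce different state_dict keys depending on how they are nested in nn.Sequential.

```python
import torch.nn as nn

# Hypothetical models: identical layers in identical order,
# but nested in different nn.Sequential containers.
original = nn.Sequential(                           # shape the checkpoint was saved from
    nn.Sequential(nn.Conv2d(3, 8, 3), nn.ReLU()),   # -> keys "0.0.weight", "0.0.bias"
    nn.Linear(8, 2),                                # -> keys "1.weight", "1.bias"
)
modified = nn.Sequential(                           # my version: flat container
    nn.Conv2d(3, 8, 3),                             # -> keys "0.weight", "0.bias"
    nn.ReLU(),
    nn.Linear(8, 2),                                # -> keys "2.weight", "2.bias"
)

print(list(original.state_dict().keys()))
print(list(modified.state_dict().keys()))
```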

Calling load_state_dict with strict=True fails with:
unexpected key "module.features.0.0.weight" in state_dict

With strict=False, the call succeeds, but the mismatched keys are silently skipped, so the weights never actually make it into the network.

Is there any way of correctly loading the weights?
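One workaround I'm considering (all names below are made up, and plain strings stand in for tensors): since both models contain the same parameters in the same order, I could zip the checkpoint's values onto the new model's key names by position and load the rebuilt dict. A rough sketch:

```python
# Toy stand-ins for state_dicts; real code would use
# torch.load(path) and model.state_dict() instead.
pretrained = {                          # keys as saved (note the "module." prefix,
    "module.features.0.0.weight": "W0", # which DataParallel adds on save)
    "module.features.0.0.bias":   "b0",
    "module.classifier.1.weight": "W1",
    "module.classifier.1.bias":   "b1",
}
new_model_keys = [                      # keys my modified model expects
    "features.0.weight", "features.0.bias",
    "classifier.0.weight", "classifier.0.bias",
]

# Map old values onto new keys positionally; this only works because
# the layers are identical and appear in the same order.
remapped = {new_k: v for new_k, v in zip(new_model_keys, pretrained.values())}
print(remapped)
```

Then `model.load_state_dict(remapped)` with strict=True should match. Is this positional remapping safe, or is there a cleaner way?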