Loading weights before the FC layers

I want to load an ImageNet pre-trained ResNet weight file, but only up to the fully-connected layer.

In other words, I only want to load the convolutional weight parameters.

Is there any way to do that? Please help me :slight_smile:

The easiest way would be to load the complete model and manipulate the linear layer(s) afterwards.
Would that be an option? Otherwise you could probably filter out the unwanted parameters from your state_dict and load them using strict=False.
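
A minimal sketch of the second suggestion, assuming torchvision's ResNet-18, where the classifier parameters are named `fc.weight` and `fc.bias` (the exact keys may differ for other architectures):

```python
import torchvision.models as models

# Build a ResNet-18 without pre-trained weights; we'll load the conv weights manually.
model = models.resnet18()

# Get the pre-trained state_dict.
pretrained_state = models.resnet18(pretrained=True).state_dict()

# Filter out the fully-connected layer's parameters ("fc.weight" and "fc.bias").
filtered_state = {k: v for k, v in pretrained_state.items() if not k.startswith("fc.")}

# strict=False allows a state_dict that doesn't cover every parameter of the model.
model.load_state_dict(filtered_state, strict=False)
```

`load_state_dict(..., strict=False)` returns the missing and unexpected keys, so you can print them to confirm that only the `fc` parameters were skipped.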

Thank you for your reply!

Yes, modifying the FC layers would work.
Actually, I want to reduce the number and the dimensions of the FC layers in the original ResNet and VGG networks.

So if I download the pre-trained network with its original structure, load it with its weight parameters in my code, and then change the FC layer structure, will all the weight parameters in the convolutional layers still be preserved?

And if the convolutional weights are preserved after changing the FC layers, how can I change or reduce the dimensions or the number of FC layers?

You would just have to load the pre-trained model and change the last linear layer to the new number of classes as described in the Transfer Learning tutorial.
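
A minimal sketch following that advice, assuming torchvision models; `num_classes` and the hidden size of 512 are placeholder values for illustration:

```python
import torch.nn as nn
import torchvision.models as models

num_classes = 10  # placeholder for your target dataset

# ResNet: the conv weights stay intact; only the final linear layer is replaced.
resnet = models.resnet18(pretrained=True)
resnet.fc = nn.Linear(resnet.fc.in_features, num_classes)

# VGG: the whole classifier can be swapped for a smaller stack of FC layers.
vgg = models.vgg16(pretrained=True)
vgg.classifier = nn.Sequential(
    nn.Linear(512 * 7 * 7, 512),  # VGG16's feature maps flatten to 512*7*7
    nn.ReLU(inplace=True),
    nn.Linear(512, num_classes),
)
```

The newly created layers are randomly initialized, so they need to be trained (or fine-tuned) on your data afterwards.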


I’ll try it. Thank you!