Modify intermediate layer

I was wondering how you would modify the out_channels of an intermediate layer of a pre-trained ResNet.
I am not talking about the fully connected layer, but about, for example, a conv2 layer.

Thank you!

Does it work?

So there are several aspects to it:

  • For the forward pass to go through, the layer following the one whose out_channels you change also needs to be adapted to expect the new number of channels.
  • The typical approach for changing the last layer is to replace it and train the new layer's weights from scratch. The more or less direct analogue here would be to train the changed layer and everything following it on your task.
  • In principle, it should also be possible to replace only the two layers that need to be modified and train them to produce the same outputs as the two layers they are replacing. This could be considered part of the family of techniques known as “model distillation”.

Best regards


The second suggestion did it for me, thank you!