Adding pre-trained weights to some channels of a CNN while keeping them frozen and training the rest of the channels

Hi, I have trained a U-Net for a denoising application, and I want to estimate the residual noise in the denoised image. For this, I want to add a few trainable channels to the upsampling convolutions of the trained U-Net while keeping the pre-trained weights frozen. As far as I understand, we can freeze a whole layer, but how do I freeze only some channels and let the other channels of the same layer receive gradients? I am also not sure whether I should define one big convolution that holds both the pre-trained and the new weights, or keep them separate and concatenate the channels in forward().

You can do something like this

for param in model.parameters():
    param.requires_grad = False

You would then set it back to True only for the parameters of the layers you actually want to train.
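Since requires_grad works per parameter rather than per channel, one way to get the behavior you describe is the concatenate-in-forward() option from your question: keep the pre-trained channels in one frozen conv and the new channels in a second, trainable conv, then stack their outputs along the channel dimension. A minimal sketch (the class name, channel counts, and kernel size are illustrative assumptions, not from your model):

```python
import torch
import torch.nn as nn

class PartiallyFrozenConv(nn.Module):
    """Hypothetical example: 16 frozen pre-trained output channels
    plus 8 new trainable output channels, concatenated in forward()."""
    def __init__(self, in_ch=3, frozen_out=16, new_out=8):
        super().__init__()
        self.frozen = nn.Conv2d(in_ch, frozen_out, 3, padding=1)
        self.new = nn.Conv2d(in_ch, new_out, 3, padding=1)
        # copy your pre-trained weights into self.frozen here, e.g.
        # self.frozen.load_state_dict(pretrained_conv.state_dict())
        for p in self.frozen.parameters():
            p.requires_grad = False

    def forward(self, x):
        # the next layer sees frozen_out + new_out channels, as if
        # this were one big conv
        return torch.cat([self.frozen(x), self.new(x)], dim=1)

layer = PartiallyFrozenConv()
out = layer(torch.randn(2, 3, 8, 8))
out.sum().backward()
print(layer.frozen.weight.grad)       # None: frozen channels get no gradient
print(layer.new.weight.grad.shape)    # only the new channels learn
```

If you would rather keep a single large conv filled with both pre-trained and new weights, you can instead register a gradient hook on the weight (Tensor.register_hook) that zeros the gradient on the pre-trained channel slice; just note that optimizers with weight decay or momentum can still move those channels, which the two-conv split avoids.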