Hi,
Can I use indexing together with requires_grad, e.g. layer.weight.data[out_idx, in_idx, :, :].requires_grad = False, to freeze channels of my choice during training so that the weights for those channels are not updated?
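For context, a minimal runnable sketch of what I have in mind (the Conv layer shape and the indices are just placeholders):

```python
import torch.nn as nn

# Placeholder layer and indices, just to illustrate the idea.
conv = nn.Conv2d(in_channels=3, out_channels=8, kernel_size=3)
out_idx, in_idx = 0, 0

# The hope is that this disables gradient updates for only the
# selected (out_idx, in_idx) kernel slice of the weight tensor.
conv.weight.data[out_idx, in_idx, :, :].requires_grad = False
```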
Hi @anantguptadbl, thanks for your reply. My problem is a little different. I am pruning and rebuilding a model, and I want to freeze only one channel of a Conv layer, say index 0 of the output channels of the Conv layer's weights. I have not been able to find a solution, because PyTorch's requires_grad flag disables gradients for the whole tensor. One solution I am working on is to zero out the gradients of the tensor at that particular index. In short, I don't want the whole layer to be non-trainable, only a part of it.
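Here is a minimal sketch of the gradient-zeroing idea, assuming a Conv2d layer where output channel 0 is the one to freeze (the layer shape and the name `frozen_out_idx` are placeholders I chose for illustration):

```python
import torch.nn as nn

conv = nn.Conv2d(in_channels=3, out_channels=8, kernel_size=3)
frozen_out_idx = 0  # output channel to freeze

def zero_frozen_grad(grad):
    # Clone so the gradient autograd hands us is not modified in place,
    # then zero the slice belonging to the frozen output channel.
    grad = grad.clone()
    grad[frozen_out_idx] = 0.0
    return grad

# The hooks fire during backward, so the frozen channel's gradient is
# already zero by the time the optimizer steps.
conv.weight.register_hook(zero_frozen_grad)
if conv.bias is not None:
    conv.bias.register_hook(zero_frozen_grad)  # bias index 0 matches channel 0
```

One caveat: a zero gradient keeps the channel fixed for plain SGD, but weight decay is applied inside the optimizer step, after the hook runs, so it can still shrink the "frozen" weights. An alternative is to zero conv.weight.grad[frozen_out_idx] manually after loss.backward() and before optimizer.step(), or to restore the frozen values after each step.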