For example, suppose we have a model containing only a single conv2d layer (1 input feature to 1 output feature, kernel size 3). How do I fix the center weight of this conv layer?
Last time I tried, I used model.conv2d.weight.grad[index of center weight] = 0. It does set the gradient to zero, but after calling optimizer.step(), the center weight is still updated. I called loss.backward() before manually setting the center weight's gradient to zero.
Zeroing out the gradients should work for optimizers without running estimates.
If you are using an optimizer with internal states (e.g. Adam), this approach will still work, as long as you zero out the gradients from the very first step, before any running estimates for those entries have been accumulated:
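A minimal sketch of this, assuming the single 1→1 conv with a 3×3 kernel from the question and Adam as the optimizer (the variable names are made up for illustration). Because the center gradient is zeroed in every iteration starting with the first, Adam's running estimates for that entry stay at zero and the weight never moves:

```python
import torch
import torch.nn as nn

model = nn.Conv2d(1, 1, kernel_size=3, bias=False)
optimizer = torch.optim.Adam(model.parameters(), lr=0.1)

center_before = model.weight[0, 0, 1, 1].clone()

for _ in range(5):
    optimizer.zero_grad()
    out = model(torch.randn(1, 1, 8, 8))
    loss = out.pow(2).mean()
    loss.backward()
    # Zero the center gradient in every iteration, starting from the first,
    # so Adam's exp_avg / exp_avg_sq for this entry never become nonzero.
    model.weight.grad[0, 0, 1, 1] = 0.0
    optimizer.step()

center_after = model.weight[0, 0, 1, 1]
print(torch.allclose(center_before, center_after))  # prints True
```

If the gradient had been nonzero in earlier steps, Adam's momentum buffers would keep updating the weight even after you start zeroing the gradient, which matches the behavior described above.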
In my case, I have more than one conv layer, and for almost all of them I have to fix part of the layer's parameters. Is there a convenient way?
If I use conv.weight.grad[:, :, 1, 1] = 0, I have to repeat it for every layer, which is neither elegant nor convenient.
Would using a for loop over the layer names be a solution?
You would have to somehow access the weight parameter of each layer you would like to partially freeze and zero out the corresponding gradient entries.