For example, suppose we have a model containing only a single Conv2d layer (1 input channel, 1 output channel, kernel size 3). How do I fix the center weight of this conv layer so it never gets updated?
Last time I tried, I used model.conv2d.weight.grad[index of center weight] = 0. It does set the gradient to zero, but after calling optimizer.step() the center weight is still updated. I called loss.backward() before manually setting the center weight's gradient to zero.
So how do I do this kind of thing in PyTorch?
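One possible sketch of the setup described above: a common reason a zeroed gradient still produces an update is that optimizers with momentum or weight decay (e.g. Adam, or SGD with momentum) modify parameters even when the current gradient is zero. The example below assumes plain SGD (no momentum, no weight decay) and uses `Tensor.register_hook` to zero the center gradient on every backward pass, which avoids having to reset `.grad` by hand each iteration. The index `[0, 0, 1, 1]` (out channel, in channel, row, col) is the center of a 3x3 kernel.

```python
import torch
import torch.nn as nn

# Toy model: a single conv layer, 1 input channel -> 1 output channel, 3x3 kernel
conv = nn.Conv2d(1, 1, kernel_size=3, padding=1)

# Hook that runs on every backward pass and zeroes the gradient
# at the kernel center before it is accumulated into .grad
def zero_center_grad(grad):
    grad = grad.clone()          # don't modify the incoming grad in place
    grad[0, 0, 1, 1] = 0.0       # center of the 3x3 kernel
    return grad

conv.weight.register_hook(zero_center_grad)

# Plain SGD, no momentum and no weight decay:
# a zero gradient then means exactly zero update for that weight
opt = torch.optim.SGD(conv.parameters(), lr=0.1)

x = torch.randn(4, 1, 8, 8)
center_before = conv.weight.data[0, 0, 1, 1].item()

loss = conv(x).pow(2).mean()
opt.zero_grad()
loss.backward()
opt.step()

center_after = conv.weight.data[0, 0, 1, 1].item()
print(center_before == center_after)  # center weight unchanged
```

If you need to keep momentum or weight decay on the other weights, an alternative is to restore the fixed weight right after `optimizer.step()` under `torch.no_grad()`, or to split the parameter into a frozen buffer plus a trainable mask; the hook approach above only guarantees a frozen weight for optimizers whose update is exactly zero when the gradient is zero.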