How to freeze a single weight of a conv2d layer

For example, suppose we have a model containing only a single Conv2d layer (1 input channel to 1 output channel, kernel size 3). How do I fix the center weight of this conv layer?

Last time I tried, I used model.conv2d.weight.grad[index for center weight] = 0. It does set that entry of weight.grad to zero, but after calling optimizer.step(), the center weight is still updated. I called loss.backward() before manually setting the center weight's grad to zero.

So how do I do something like this in PyTorch?


Zeroing out the gradients should work for optimizers without running estimates.
If you are using an optimizer with internal states (e.g. Adam), this approach will still work if you zero out the gradients from the beginning:

import torch
import torch.nn as nn

conv = nn.Conv2d(1, 1, 3)
weight_reference = conv.weight.clone()
optimizer = torch.optim.Adam(conv.parameters(), lr=1.)

x = torch.randn(1, 1, 10, 10)
out = conv(x)
out.mean().backward()

# zero out the gradient of the center weight before the first optimizer step
conv.weight.grad[:, :, 1, 1] = 0.
optimizer.step()

# only the center weight is unchanged after the update
print(weight_reference == conv.weight)
> tensor([[[[False, False, False],
          [False,  True, False],
          [False, False, False]]]])

However, once a valid (non-zero) update has been performed for that weight, the optimizer's running averages might keep updating the parameter even if you zero out its gradient afterwards.
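
If that is a concern (e.g. the entry already received non-zero gradients before you started zeroing them), a simple workaround is to restore the frozen value right after each optimizer.step(). This is just a minimal sketch of that idea, assuming the same center-weight index as above:

import torch
import torch.nn as nn

conv = nn.Conv2d(1, 1, 3)
optimizer = torch.optim.Adam(conv.parameters(), lr=1.)

# remember the value that should stay fixed
frozen_value = conv.weight[:, :, 1, 1].detach().clone()

for _ in range(3):
    optimizer.zero_grad()
    out = conv(torch.randn(1, 1, 10, 10))
    out.mean().backward()
    conv.weight.grad[:, :, 1, 1] = 0.           # zero the gradient of the frozen entry
    optimizer.step()
    with torch.no_grad():
        conv.weight[:, :, 1, 1] = frozen_value  # undo any drift caused by the optimizer state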


Very clean answer. Thanks.

Hi, @ptrblck

Thank you very much for your answer.

In my case, I have more than one conv layer, and for almost all of these layers I have to fix part of the parameters. Is there a convenient way to do this?

If I use conv.weight.grad[:, :, 1, 1] = 0., I have to do this for every layer, which is neither elegant nor convenient.

Would using a for loop and the layer names be a solution?
You would have to somehow access each weight parameter of the layers whose gradient entries you would like to set to zero.
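
One way to avoid repeating the indexing by hand is to keep a boolean mask per parameter and zero the masked gradient entries in a single loop over model.named_parameters() before optimizer.step(). A minimal sketch, assuming a hypothetical freeze_masks dict and that you want to freeze the center tap of every conv kernel:

import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Conv2d(1, 4, 3, padding=1),
    nn.ReLU(),
    nn.Conv2d(4, 1, 3, padding=1),
)
optimizer = torch.optim.Adam(model.parameters(), lr=1.)

# hypothetical dict: parameter name -> boolean mask (True = freeze this entry)
freeze_masks = {}
for name, param in model.named_parameters():
    if name.endswith('weight') and param.dim() == 4:
        mask = torch.zeros_like(param, dtype=torch.bool)
        mask[:, :, 1, 1] = True  # e.g. freeze the center tap of each kernel
        freeze_masks[name] = mask

out = model(torch.randn(1, 1, 10, 10))
out.mean().backward()

# zero the gradients of all frozen entries in one loop
for name, param in model.named_parameters():
    if name in freeze_masks:
        param.grad[freeze_masks[name]] = 0.

optimizer.step()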
