How to update only one coordinate of a layer?

For example, I have a simple network like this

import torch.nn as nn

class Net(nn.Module):
  def __init__(self):
    super().__init__()
    self.fc1 = nn.Linear(20, 1)

  def forward(self, x):
    return self.fc1(x)

fc1 has 20 weight parameters (plus one bias). I don't want to update all of them in a single step; instead, I want to update just one of those parameters in a customized way. How should I do that? Thanks for any help.

Take a look at @KFrank’s answer in this post. :wink:

What you can try is to call .backward() but not call .step() immediately. Instead, you can access the individual gradients populated by .backward() via:

self.fc1.weight.grad
self.fc1.bias.grad

Once you have modified those gradients (for example, zeroing out the entries you don't want to update), call .step(), or apply the update to the parameters manually.
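As a concrete sketch of that idea (the SGD learning rate, the random data, and the choice of coordinate [0, 0] are all illustrative assumptions, not from the original posts): after .backward() you can mask the gradient tensor so that only one weight coordinate keeps its gradient, and zero the bias gradient, before calling .step().

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Single-layer network from the question.
class Net(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc1 = nn.Linear(20, 1)

    def forward(self, x):
        return self.fc1(x)

net = Net()
opt = torch.optim.SGD(net.parameters(), lr=0.1)  # illustrative optimizer/lr

x = torch.randn(4, 20)      # dummy batch
target = torch.randn(4, 1)  # dummy targets

opt.zero_grad()
loss = nn.functional.mse_loss(net(x), target)
loss.backward()

# Snapshot the parameters so we can verify what changed.
w_before = net.fc1.weight.detach().clone()
b_before = net.fc1.bias.detach().clone()

# Zero every gradient entry except weight[0, 0], and freeze the bias,
# so the optimizer step touches only that single coordinate.
with torch.no_grad():
    mask = torch.zeros_like(net.fc1.weight.grad)
    mask[0, 0] = 1.0
    net.fc1.weight.grad.mul_(mask)
    net.fc1.bias.grad.zero_()

opt.step()

# Only weight[0, 0] should differ from the snapshot.
changed = (net.fc1.weight.detach() != w_before)
```

Alternatively, you can skip the optimizer entirely and write the custom update yourself inside a torch.no_grad() block, e.g. `net.fc1.weight[0, 0] -= 0.1 * net.fc1.weight.grad[0, 0]`.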