Parameter not updating

When I do something like this,

import torch
import torch.nn as nn

class Custom(nn.Module):
    def __init__(self):
        super().__init__()
        self.threshold = nn.Parameter(torch.randn(1))

    def forward(self, x):
        print(self.threshold)
        return x

class Net(nn.Module):
    def __init__(self):
        super().__init__()
        self.custom = Custom()

    def forward(self, x):
        x = self.custom(x)
        return x

and when I run

net = Net()
list(net.parameters())

it does show the threshold parameter, and requires_grad is set to True for it.
But when I later call optimizer.step(), the threshold parameter does not update; it stays at one fixed value.
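One way to check whether a gradient even reaches threshold is to print its .grad after backward(); a minimal sketch (the input and loss below are placeholders, not my real ones):

net = Net()
out = net(torch.randn(4, 1))   # placeholder input
loss = out.sum()               # placeholder loss
loss.backward()

# if threshold never contributes to the loss, its .grad stays None
print(net.custom.threshold.grad)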


Do you have a loss function? Normally, one should do:

net = Net()
optim = torch.optim.SGD(net.parameters(), lr=0.1)  # or any other optimizer

pred = net(X)
loss = loss_fn(pred, y)
loss.backward()   # computes the gradients
optim.step()      # applies them to the parameters

If you do not compute the loss and call backward(), PyTorch does not know how to update the parameters…
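For comparison, here is a tiny self-contained example (the data, loss, and learning rate are made up) where a custom nn.Parameter does get updated, because it takes part in computing the output:

import torch
import torch.nn as nn

class Scaled(nn.Module):
    def __init__(self):
        super().__init__()
        self.threshold = nn.Parameter(torch.randn(1))

    def forward(self, x):
        # threshold participates in the output, so it receives a gradient
        return x * self.threshold

net = Scaled()
optim = torch.optim.SGD(net.parameters(), lr=0.1)
X = torch.randn(8, 1)   # dummy data
y = torch.randn(8, 1)
loss_fn = nn.MSELoss()

for _ in range(3):
    optim.zero_grad()          # clear old gradients
    pred = net(X)
    loss = loss_fn(pred, y)
    loss.backward()            # populates threshold.grad
    optim.step()               # moves threshold
    print(net.threshold.item())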

Hello, thanks for your reply. The parameters defined in the Net class get updated, but the parameters defined in the Custom class do not; that is, threshold does not get updated. I am already doing the loss computation, backpropagation, and optimizer step.

Would you have a small working example to reproduce the issue?
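Something self-contained along these lines, with your real modules and loss swapped in (the optimizer and dummy data below are just placeholders), would make it easy to debug:

net = Net()
optim = torch.optim.SGD(net.parameters(), lr=0.1)

before = net.custom.threshold.detach().clone()
pred = net(torch.randn(4, 1))      # dummy input
loss = pred.sum()                  # stand-in for the real loss
loss.backward()
optim.step()

# compare the parameter and its gradient after one step
print(net.custom.threshold.grad)
print(before, net.custom.threshold.data)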