When I do something like this,
import torch
import torch.nn as nn

class Custom(nn.Module):
    def __init__(self):
        super().__init__()
        self.threshold = nn.Parameter(torch.randn(1))

    def forward(self, x):
        print(self.threshold)
        return x

class Net(nn.Module):
    def __init__(self):
        super().__init__()
        self.custom = Custom()

    def forward(self, x):
        x = self.custom(x)
        return x
and then run
net = Net()
list(net.parameters())
the threshold parameter does show up in the list, and its requires_grad is True.
But when I later call optimizer.step(), the threshold parameter does not update; it stays fixed at its initial value.
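For reference, here is a minimal self-contained repro of the setup above; the optimizer (SGD), the loss, and the input are assumptions, since the post does not show them. Note that threshold is only printed inside forward and never used to compute the output, so it receives no gradient, which would explain why step() leaves it unchanged:

```python
import torch
import torch.nn as nn

class Custom(nn.Module):
    def __init__(self):
        super().__init__()
        self.threshold = nn.Parameter(torch.randn(1))

    def forward(self, x):
        print(self.threshold)  # threshold is read, but does not affect the output
        return x

class Net(nn.Module):
    def __init__(self):
        super().__init__()
        self.custom = Custom()

    def forward(self, x):
        return self.custom(x)

torch.manual_seed(0)
net = Net()
optimizer = torch.optim.SGD(net.parameters(), lr=0.1)  # assumed optimizer/lr

# requires_grad on the input stands in for upstream layers, so backward() runs
x = torch.randn(4, 3, requires_grad=True)
loss = net(x).sum()  # assumed loss; the post does not show one

before = net.custom.threshold.detach().clone()
optimizer.zero_grad()
loss.backward()
optimizer.step()
after = net.custom.threshold.detach().clone()

print(net.custom.threshold.grad)    # None: the loss never depended on threshold
print(torch.equal(before, after))   # True: SGD skips parameters with no gradient
```

Since threshold.grad stays None, SGD has nothing to apply to it; the parameter only updates if it actually participates in the computation that produces the loss.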