Can't backward the loss

I'm trying to perform gradient descent on a point, but loss.backward() fails because requires_grad is False.

My code:

class Net(nn.Module):
    def __init__(self, p, P1, P2):
        super(Net, self).__init__()
        self.p = torch.tensor(p, dtype=torch.float32, requires_grad=True)
        self.P1 = torch.tensor(P1, dtype=torch.float32)
        self.P2 = torch.tensor(P2, dtype=torch.float32)
        self.one = torch.tensor([1], dtype=torch.float32)

    def forward(self):
        p = torch.cat([self.p, self.one])
        u1_h = torch.matmul(self.P1, p)
        u2_h = torch.matmul(self.P2, p)
        u1 = u1_h[:2] / u1_h[2]
        u2 = u2_h[:2] / u2_h[2]
        return torch.cat([u1, u2])

net = Net(out_3d[landmark_idx, :], P1, P2)
criterion = nn.MSELoss()
optimizer = optim.SGD([net.p], lr=0.001, momentum=0.9)
label = torch.Tensor([[u1, v1, u2, v2]])
for iter in range(10):
    outputs = net()
    loss = criterion(outputs, label)
    print(f'landmark #{landmark_idx}: {iter} - {loss}')
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

Apparently, right after the p = torch.cat(...) line in forward, p.requires_grad is already False.

Any ideas why?

I don’t know which shapes you’ve used or how the undefined variables are used, but this code snippet shows that outputs has a valid grad_fn and that its .requires_grad attribute is True:

b = torch.randn(3)
c = torch.randn(3, 4)
net = Net(b, c, c)

outputs = net()
print(outputs.grad_fn)
# <CatBackward0 object at 0x7f552df74f40>
print(outputs.requires_grad)
# True

outputs.mean().backward() # works

First of all: Thanks a lot for the quick reply :slight_smile:

Second: You’re right of course. I didn’t run the snippet as standalone, and in the code I have above it, there’s a with torch.no_grad(): :man_facepalming:
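
For anyone hitting the same symptom, here is a minimal sketch (with a made-up tensor, not the code above) of why a surrounding torch.no_grad() block produces exactly this behavior: operations executed inside it are not tracked by autograd, so their results come out with requires_grad=False and a later backward() raises a RuntimeError:

```python
import torch

p = torch.tensor([1.0, 2.0, 3.0], requires_grad=True)

# Outside no_grad: the cat result is part of the autograd graph.
out = torch.cat([p, torch.ones(1)])
print(out.requires_grad)  # True

# Inside no_grad: autograd tracking is disabled, so the result is
# detached from the graph even though p itself requires grad.
with torch.no_grad():
    out = torch.cat([p, torch.ones(1)])
print(out.requires_grad)  # False -- out.sum().backward() would raise here
```

So the fix is simply to make sure the forward pass and loss computation run outside any torch.no_grad() context (or inside torch.enable_grad() if you must nest them).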