Hi. I was testing the following toy example:
import torch

q = torch.rand(10, 2, requires_grad=True)
p = torch.rand(10, 1, requires_grad=True)
input = torch.cat((q, p), dim=1)
u = torch.sum(input, dim=1)
u_q = torch.autograd.grad(u, q, grad_outputs=torch.ones_like(u), create_graph=True, retain_graph=True)[0]
u_p = torch.autograd.grad(u, p, grad_outputs=torch.ones_like(u), create_graph=True, retain_graph=True)[0]
u_qq = torch.autograd.grad(u_q, q, grad_outputs=torch.ones_like(u_q), create_graph=True, retain_graph=True)[0]
The first two grad calls succeed, but the u_qq line raises this error:
RuntimeError: element 0 of tensors does not require grad and does not have a grad_fn
However, if I change u = torch.sum(input, dim=1) to u = torch.sum(input**1, dim=1), the error disappears. Can anyone explain why?
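For reference, here is a minimal sketch that makes the difference observable by inspecting the first-order gradient's requires_grad and grad_fn in each case (I renamed input to x to avoid shadowing the Python builtin):

```python
import torch

q = torch.rand(10, 2, requires_grad=True)
p = torch.rand(10, 1, requires_grad=True)
x = torch.cat((q, p), dim=1)

# Linear case: u is a plain sum, so du/dq is a constant tensor of ones.
# Even with create_graph=True, the result is not connected to the graph.
u = torch.sum(x, dim=1)
u_q = torch.autograd.grad(u, q, grad_outputs=torch.ones_like(u),
                          create_graph=True)[0]
print(u_q.requires_grad, u_q.grad_fn)  # False None -> second grad call fails

# With x**1, the pow backward is expressed in terms of x, so the
# first-order gradient stays in the graph and can be differentiated again.
u2 = torch.sum(x**1, dim=1)
u2_q = torch.autograd.grad(u2, q, grad_outputs=torch.ones_like(u2),
                           create_graph=True)[0]
print(u2_q.requires_grad)  # True -> second-order grad succeeds
```

This is only a diagnostic sketch: it shows that in the plain-sum case the first-order gradient has no grad_fn, which is exactly what the error message complains about when it is passed back into torch.autograd.grad.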