I have a question about torch autograd. When I build a model and write:
```python
x.requires_grad = True
pred = model(x)
pred.sum().backward()
print(x.grad)
```
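For reference, here is a minimal self-contained version of that snippet (the `nn.Linear` model and the shapes are just stand-ins for my real setup):

```python
import torch
import torch.nn as nn

model = nn.Linear(3, 2)                    # placeholder for my real model
x = torch.randn(4, 3, requires_grad=True)  # leaf tensor, so .grad gets populated
pred = model(x)
pred.sum().backward()
print(x.grad)                              # d(pred.sum())/dx, shape (4, 3)
```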
This works: I get the gradient of x. Now I want to go one step further in the forward direction. Is there a method to get d(x.sum())/d(pred), using code like the following?
```python
x.requires_grad = True
pred = model(x)
pred.requires_grad = True
pred.sum().backward(retain_graph=True)
x.sum().backward()
print(pred.grad)
```
I have tried this, but pred.grad is None, and x.grad.grad_fn is also None.
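For completeness, here is a self-contained variant of my attempt. I swapped in pred.retain_grad(), since (if I understand correctly) requires_grad can only be assigned on leaf tensors and pred is not a leaf; but even then, pred.grad ends up holding the gradient from the first backward call rather than d(x.sum())/d(pred):

```python
import torch
import torch.nn as nn

model = nn.Linear(3, 2)                    # same placeholder model as above
x = torch.randn(4, 3, requires_grad=True)
pred = model(x)
pred.retain_grad()                         # needed to populate .grad on a non-leaf tensor
pred.sum().backward(retain_graph=True)
x.sum().backward()                         # x.sum() is computed from x alone, so this
                                           # backward pass never reaches pred
print(pred.grad)                           # all ones: d(pred.sum())/d(pred), not d(x.sum())/d(pred)
```

Thanks for your help.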