Suppose we have an expression z = x^2 * y + y, where x and y are both Tensors.
We want to calculate d(dz/dx)/dy, where d is the partial derivative operator.
I do as follows:
import torch

# Concrete scalar values so backward() has a scalar output to differentiate.
x = torch.tensor(3.0)
y = torch.tensor(4.0)
x.requires_grad_()
y.requires_grad_()

loss = x**2 * y + y
# create_graph=True builds the graph of the backward pass itself,
# so x.grad can be differentiated again (it also implies retain_graph=True).
loss.backward(create_graph=True)

xx = x.grad  # dz/dx = 2*x*y; already requires grad, no requires_grad_() needed
result = torch.autograd.grad(xx, y)  # d(dz/dx)/dy = 2*x
This piece of code gets the correct result. However, I wonder whether this is the correct way to calculate it, or whether I am only getting the right answer by coincidence.
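For comparison, here is a minimal sketch of the same computation written with two explicit calls to torch.autograd.grad instead of backward() plus reading x.grad; the values 3.0 and 4.0 are arbitrary choices for illustration, and the result is checked against the analytic answer d(dz/dx)/dy = 2*x:

```python
import torch

# Arbitrary sample values; any values work.
x = torch.tensor(3.0, requires_grad=True)
y = torch.tensor(4.0, requires_grad=True)

z = x**2 * y + y

# First derivative; create_graph=True keeps the graph of this
# backward pass so the result can be differentiated again.
dz_dx, = torch.autograd.grad(z, x, create_graph=True)  # dz/dx = 2*x*y

# Mixed second derivative: d(dz/dx)/dy = 2*x
d2z_dxdy, = torch.autograd.grad(dz_dx, y)
print(d2z_dxdy)  # tensor(6.)
```

This avoids mutating x.grad as an intermediate and makes the dependency of the second grad call on the first one explicit.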