A question about higher-order derivatives

I want to use PyTorch to compute the following derivative: ∂((x + 0.1·∂(xy)/∂x)²)/∂y. It should be 0.2(x + 0.1y), since ∂(xy)/∂x = y and ∂((x + 0.1y)²)/∂y = 2(x + 0.1y)·0.1.
I tried to solve this with the following code:

import torch
torch.manual_seed(11037)
x1 = torch.rand(1).requires_grad_(True)
y1 = torch.rand(1).requires_grad_(True)
print(x1)
print(y1)
loss1 = x1 * y1
grad1 = torch.autograd.grad(loss1, x1, retain_graph=True)
x2 = x1 + grad1[0] * 0.1
loss2 = x2 * x2
loss2.backward()
print(y1.grad)
print(x2 * 0.2)  # expected value of y1.grad

However, y1.grad comes out as None. I can somehow understand why this is the case: y1 does not appear explicitly in the expression for loss2. If I change loss2 = x2 * x2 to loss2 = x2 * y1, then y1.grad is no longer None.
So what should I do? I encountered this problem while implementing the paper "Training Meta-Surrogate Model for Transferable Adversarial Attack". The actual problem I'm trying to solve is to generate an adversarial sample based on model A, feed it to model B to get a loss, and pass the gradient back to model A.

Hi Dingcheng!

This line should be:

grad1 = torch.autograd.grad(loss1, x1, create_graph=True)

Note the create_graph. Unlike retain_graph, create_graph=True builds a graph for the gradient computation itself, so grad1[0] (which equals y1) stays connected to y1 and loss2.backward() can propagate a gradient into y1.grad.
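With that one change, your original script produces the expected answer. (A minimal runnable version; note the correct expected value is 0.2 * x2, since d(x2²)/dy = 2·x2·0.1.)

```python
import torch

torch.manual_seed(11037)
x1 = torch.rand(1).requires_grad_(True)
y1 = torch.rand(1).requires_grad_(True)

loss1 = x1 * y1
# create_graph=True makes grad1[0] itself part of the autograd
# graph, so it stays connected to y1.
grad1 = torch.autograd.grad(loss1, x1, create_graph=True)
x2 = x1 + grad1[0] * 0.1

loss2 = x2 * x2
loss2.backward()

print(y1.grad)   # equals 0.2 * x2
print(0.2 * x2)
```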

Best.

K. Frank


Thanks!!! It works!!!
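For anyone landing here with the model-A/model-B use case from the question, the same create_graph trick applies. A minimal sketch with hypothetical stand-in models (model_a, model_b, and the plain gradient step are illustrative, not the paper's exact method; a sign()-based FGSM step would block the outer gradient, since sign() has zero derivative almost everywhere):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

torch.manual_seed(11037)

# Hypothetical stand-ins: model A crafts the adversarial sample,
# model B scores it.
model_a = nn.Linear(4, 2)
model_b = nn.Linear(4, 2)

x = torch.randn(1, 4, requires_grad=True)
target = torch.tensor([0])

# Inner step: gradient of model A's loss w.r.t. the input.
# create_graph=True keeps this gradient differentiable, so the
# outer backward can flow through it into model A's parameters.
loss_a = F.cross_entropy(model_a(x), target)
grad_x, = torch.autograd.grad(loss_a, x, create_graph=True)
x_adv = x + 0.1 * grad_x  # differentiable perturbation step

# Outer step: model B's loss on the adversarial sample.
loss_b = F.cross_entropy(model_b(x_adv), target)
loss_b.backward()

# Gradients reach model A's parameters through the inner grad.
print(model_a.weight.grad is not None)  # True
```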