I think the problem is that you set j.requires_grad after you already used j, so the graph is not created during the forward pass.
You should move that line to the beginning of the forward() function.
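For example, a minimal standalone sketch (not your actual model) showing the difference:

import torch

j = torch.randn(3)            # leaf tensor, requires_grad is False
out = (j * 2).sum()           # no graph is recorded for j here
j.requires_grad_(True)        # too late: out was computed without a graph
# out.backward() would raise a RuntimeError here, since out does not require grad

j = torch.randn(3).requires_grad_(True)  # set it before the tensor is used
out = (j * 2).sum()                      # now the graph includes j
out.backward()
print(j.grad)                            # tensor([2., 2., 2.])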
I’m experiencing the same problem, but probably for a different reason.
This is my code: I declare requires_grad_(True) and then call loss.backward(), but in the end None is printed, which is very odd. I’m using the latest versions of torch/torchvision (1.9.1 and 0.10.1, respectively). If anyone has a clue about what might be happening, let me know!
net.eval()
adv_x = x.clone().detach().float().requires_grad_(True)
# start from a random point near x
rand = torch.zeros_like(x).uniform_(-eps, eps).float()
adv_x = adv_x + rand
adv_x = torch.clamp(adv_x, x_min, x_max)
if not targeted:
    target = torch.argmax(net.forward(x))
criterion = nn.CrossEntropyLoss()
for _ in range(n_iters):
    pred = net.forward(adv_x)
    loss = criterion(pred, target)
    if targeted:
        loss = -loss
    loss.backward()
print(adv_x.grad)
If you are trying to access the .grad attribute of adv_x, you will also get a warning which explains the returned None value:
y = adv_x * 2
y.backward()
print(adv_x.grad)
> None
UserWarning: The .grad attribute of a Tensor that is not a leaf Tensor is being accessed. Its .grad attribute won't be populated during autograd.backward(). If you indeed want the gradient for a non-leaf Tensor, use .retain_grad() on the non-leaf Tensor. If you access the non-leaf Tensor by mistake, make sure you access the leaf Tensor instead. See github.com/pytorch/pytorch/pull/30531 for more information.
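If you really do need the gradient of a non-leaf tensor, you can follow the warning and call .retain_grad() on it before the backward pass. A minimal sketch:

import torch

x = torch.randn(3, requires_grad=True)  # leaf tensor
adv_x = x + 0.1                         # non-leaf: result of an operation
adv_x.retain_grad()                     # ask autograd to populate .grad anyway

y = (adv_x * 2).sum()
y.backward()
print(adv_x.grad)                       # tensor([2., 2., 2.]) instead of None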
I don’t know exactly where you are creating this cloned tensor, but alternatively you could use different variable names to make sure the original (leaf) adv_x isn’t overwritten.
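Applied to your snippet, that could look something like this (a sketch; perturbed_x is just a made-up name):

adv_x = x.clone().detach().float().requires_grad_(True)  # stays a leaf
rand = torch.zeros_like(x).uniform_(-eps, eps).float()
perturbed_x = torch.clamp(adv_x + rand, x_min, x_max)    # adv_x is not overwritten

pred = net.forward(perturbed_x)
loss = criterion(pred, target)
loss.backward()
print(adv_x.grad)                                        # now populated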
Hm, yes, indeed, but adv_x is updated later in the code, and in my algorithm the gradient is computed at the start of each iteration with the new value of adv_x. That means I would have to clone a variable there either way (obviously my earlier code wouldn’t have worked properly anyway; the grad would have been wrong :))! Thanks!
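For reference, a common pattern for this kind of iterative attack (e.g. PGD-style) is to detach and re-leaf the adversarial input at the top of every iteration. A sketch using the variables from the snippet above (alpha, the step size, is an assumed hyperparameter):

adv_x = torch.clamp(x.clone().detach() + torch.zeros_like(x).uniform_(-eps, eps), x_min, x_max)
for _ in range(n_iters):
    adv_x = adv_x.detach().requires_grad_(True)  # fresh leaf each iteration
    loss = criterion(net(adv_x), target)
    if targeted:
        loss = -loss
    loss.backward()
    # gradient ascent step on the input, then clamp back to the valid range
    adv_x = adv_x + alpha * adv_x.grad.sign()
    adv_x = torch.clamp(adv_x, x_min, x_max)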