One of the differentiated Tensors appears to not have been used in the graph. Set allow_unused=True if this is the desired behavior

 # F is torch.nn.functional; preds, y, z, eps, generator and model are defined earlier
 loss   = F.cross_entropy(preds, y.unsqueeze(0))

 grad   = torch.autograd.grad(eps * loss, z, grad_outputs=None, only_inputs=True, retain_graph=False)[0]

 z      = z.clone().detach()
 z_adv  = (z + grad).clone().detach()   # adversarial latent, cut from the graph

 one_hot_y = torch.eye(10)[y].unsqueeze(0).detach().cuda()
 one_hot_y.requires_grad = False

 x_adv  = generator(z_adv, one_hot_y)
 logit_eps = 10e-9                      # note: 10e-9 is 1e-8; unused in this snippet

 with torch.enable_grad():
     z_adv.requires_grad = True
     logits = model(x_adv)

     loss  = F.cross_entropy(logits, y.unsqueeze(0))
     grad  = torch.autograd.grad(eps * loss, z_adv, grad_outputs=None, only_inputs=False, retain_graph=False)[0]

When I ran the above code, I got:

RuntimeError: One of the differentiated Tensors appears to not have been used in the graph. Set allow_unused=True if this is the desired behavior.
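
I can reproduce the same error with this minimal standalone sketch (the names here are placeholders, not my actual model; w stands in for the generator's weights). It makes me suspect that setting z_adv.requires_grad = True only after x_adv = generator(z_adv, one_hot_y) has been computed means z_adv never entered the graph that produced logits:

 import torch

 w = torch.randn(3, requires_grad=True)  # stands in for the generator's weights
 z = torch.randn(3)                      # requires_grad is False at this point
 out = (w * z).sum()                     # graph tracks w, but treats z as a constant
 z.requires_grad_(True)                  # too late: out's graph never saw z as differentiable
 torch.autograd.grad(out, z)             # -> RuntimeError: ... not have been used in the graph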

How can I solve this problem?
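
For reference, reordering things so that z_adv requires grad before it is passed to the generator seems to make the error go away in the toy example above, but I am not sure it is the right fix for my setup:

 with torch.enable_grad():
     z_adv.requires_grad_(True)            # enable tracking first
     x_adv  = generator(z_adv, one_hot_y)  # now the graph records z_adv
     logits = model(x_adv)
     loss   = F.cross_entropy(logits, y.unsqueeze(0))
     grad   = torch.autograd.grad(eps * loss, z_adv)[0]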