Assignment causes grad attribute to disappear?

    image2 = image + epsilon*sign_data_grad

In the above line of code, image.grad exists, but afterwards image2.grad no longer exists. How do I keep it there?

Yes, the name image2 gets assigned a new tensor (the result of the computation on the right-hand side), so the .grad of the old tensor is not carried over.
If you want to keep the gradient, you would need to use an in-place operation (with all that entails) à la

    with torch.no_grad():
        image2.copy_(image + epsilon*sign_data_grad)

(or add_ or += if you only want to add to image2).
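A minimal sketch of that pattern (tensor names and shapes are made up for illustration; image2 is assumed to already exist as its own tensor, since copy_ writes into an existing one):

```python
import torch

epsilon = 0.1
image = torch.zeros(3, requires_grad=True)
image.sum().backward()                    # populate image.grad
sign_data_grad = image.grad.sign()

# image2 is a separate leaf tensor with its own .grad
image2 = image.clone().detach().requires_grad_(True)
image2.sum().backward()                   # image2.grad now exists

with torch.no_grad():                     # in-place write, hidden from autograd
    image2.copy_(image + epsilon * sign_data_grad)

print(image2.grad is not None)            # the .grad attribute survives the update
```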

Best regards

Thomas

Thanks for the response. I tried that, but I got: RuntimeError: a leaf Variable that requires grad has been used in an in-place operation.

Do you know why this is a problem?
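For reference, here is a minimal reproduction of that error; my guess at the cause is that the in-place operation ran outside the torch.no_grad() block, which autograd forbids on a leaf tensor that requires grad:

```python
import torch

image2 = torch.zeros(3, requires_grad=True)  # a leaf tensor that requires grad

err = None
try:
    image2 += 1.0   # in-place op outside torch.no_grad() -> RuntimeError
except RuntimeError as e:
    err = e

print(err)  # mentions a leaf Variable used in an in-place operation
```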