How should I keep gradients for a reassigned variable?

I am trying to implement a version of iterative FGSM. I want to iteratively backpropagate and then reassign a variable like so:

def fgsm_attack(image, epsilon, data_grad):
    # Collect the element-wise sign of the data gradient
    sign_data_grad = data_grad.sign()
    # Create the perturbed image by adjusting each pixel of the input image
    perturbed_image = image + epsilon*sign_data_grad
    # Adding clipping to maintain [0,1] range
    perturbed_image = torch.clamp(perturbed_image, 0, 1)
    # Return the perturbed image
    return perturbed_image

I call this function iteratively, but the problem is that perturbed_image loses its gradient connection to image. I also cannot set perturbed_image.requires_grad = True, because it is not a leaf variable. What is the correct way to handle this?
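One common pattern for this situation is to detach the perturbed tensor at the end of each iteration and re-mark it as a leaf with requires_grad_(True), so the next backward pass populates its .grad field. Below is a minimal sketch of that pattern; the loss_fn here is a hypothetical stand-in for a real model-plus-loss, used only so the loop runs on its own:

```python
import torch

def fgsm_attack(image, epsilon, data_grad):
    # Collect the element-wise sign of the data gradient
    sign_data_grad = data_grad.sign()
    # Perturb each pixel, then clip back into the [0, 1] range
    perturbed_image = image + epsilon * sign_data_grad
    return torch.clamp(perturbed_image, 0, 1)

# Hypothetical stand-in for model(image) + criterion; replace with your own.
def loss_fn(x):
    return (x ** 2).sum()

image = torch.rand(1, 3, 4, 4)
# Start from a fresh leaf tensor that tracks gradients
perturbed = image.clone().detach().requires_grad_(True)

for _ in range(5):
    loss = loss_fn(perturbed)
    loss.backward()                      # fills perturbed.grad
    with torch.no_grad():                # the update itself is not tracked
        stepped = fgsm_attack(perturbed, 0.01, perturbed.grad)
    # Re-wrap the result as a new leaf so the next backward() works on it
    perturbed = stepped.detach().requires_grad_(True)
```

The key point is the last line: detach() cuts the tensor out of the old graph, and requires_grad_(True) makes the detached copy a leaf again, so requires_grad can legitimately be set on it.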

So do you want to change the value inside image to the perturbed value, or do you want to keep both?

Do you actually want it to share the same .grad field as image, or just to make the perturbed image a leaf so that you can get gradients for it?