Changing the value of a variable while computing gradients

I would like to change the value of a variable between two iterations while still computing gradients. As an example, consider the following code:

```python
import torch

optimizer = torch.optim.Adam([{'params': param1, 'lr': 1e-2}])
for ite in range(ite_max):
    loss = compute_loss(param1)

    # Store the current value of param1
    store_value = param1.detach().clone()

    # Now I want to change the value of param1 to compute a new gradient,
    # where new_value changes at each iteration
    param1 = torch.tensor([new_value], requires_grad=True)
```

My question is about the last line: I suspect it is wrong, since the optimizer was given the original `param1` tensor and rebinding the name creates a new tensor, but I don't know the right way to do this.
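For reference, here is a minimal runnable sketch of the in-place alternative I have been considering, using a dummy scalar parameter and a dummy loss (both are placeholders, not the real `compute_loss`). Copying into the parameter under `torch.no_grad()` overwrites its value without rebinding the name, so the optimizer keeps updating the same tensor:

```python
import torch

# Hypothetical setup: a single scalar parameter and a stand-in loss.
param1 = torch.tensor([1.0], requires_grad=True)
optimizer = torch.optim.Adam([{'params': param1, 'lr': 1e-2}])

for ite in range(3):
    optimizer.zero_grad()
    loss = (param1 ** 2).sum()  # stand-in for compute_loss(param1)
    loss.backward()
    optimizer.step()

    # Snapshot the current value, as in the question.
    store_value = param1.detach().clone()

    # Overwrite the parameter in place instead of rebinding the name,
    # so the optimizer still tracks the same leaf tensor.
    new_value = float(ite)  # placeholder; changes at each iteration
    with torch.no_grad():
        param1.copy_(torch.tensor([new_value]))
```

After the loop, `param1` is still the same leaf tensor held by the optimizer, with `requires_grad=True`, so the next iteration's `backward()` populates its gradient as usual.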