Does torch.autograd.grad save gradient?

I’m trying to get the gradient of a tensor with respect to a loss function using torch.autograd.grad, but I don’t want the gradients of other tensors or model parameters to be stored in their .grad fields.
I’m not sure whether torch.autograd.grad does that.
Thanks in advance


autograd.grad does not touch the .grad fields. It returns the gradients it computes directly, and it only computes the gradients that are actually needed for the inputs you pass in.
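
A minimal sketch of this behavior (tensor names are illustrative): asking torch.autograd.grad for the gradient with respect to one tensor returns that gradient as a tuple, while the .grad attributes of all tensors involved stay None.

```python
import torch

# Two leaf tensors that require gradients.
x = torch.randn(3, requires_grad=True)
w = torch.randn(3, requires_grad=True)
loss = (w * x).sum()

# Ask only for the gradient with respect to x.
(grad_x,) = torch.autograd.grad(loss, x)

print(grad_x)   # gradient of loss w.r.t. x (equals w here)
print(x.grad)   # None — .grad was not touched
print(w.grad)   # None — w's gradient was not even computed
```

Compare with loss.backward(), which would have populated both x.grad and w.grad.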

Thanks a lot, it helped!