Hey, I need to apply a single operation to a tensor without having it recorded in the computation graph. At the same time, the operations already recorded in the graph should be kept and not discarded.
I tried to implement it with torch.no_grad() like this:
import torch
t = torch.rand(1).requires_grad_()
# some previous tensor operations
t = t.add(0)
# operation that I don't want to be recorded
with torch.no_grad():
    t = t.sub(0)
The result is that t.grad_fn is None, but I would want it to stay AddBackward0.
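For reference, this is how I'm checking it (just an illustrative check, continuing directly from the snippet above):

print(t.grad_fn)  # prints None after the no_grad block; I'd want AddBackward0 here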
Is there any convenient way to do this? Thanks.