Hi. I’d like to ask about `tensor.detach()`. I’ve read this post already, but I’d like to ask: if I perform `tensor_copy = tensor.detach()`, this will create a copy of `tensor` detached from the graph and store the detached copy in `tensor_copy`. My question is: when performing this operation, does `tensor` get removed from the computation graph, or will it still remain? (I know that `tensor_copy` will be removed from the graph, but I’m not sure about `tensor`.)
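To make the question concrete, here is a minimal sketch of the check I have in mind (the names `x`, `y`, and `y_copy` are just placeholders I made up):

```python
import torch

x = torch.ones(3, requires_grad=True)
y = x * 2            # y is produced by an autograd op, so it has a grad_fn
y_copy = y.detach()  # detached copy sharing the same storage

print(y.grad_fn)             # <MulBackward0 ...> -- still set after detach()
print(y_copy.grad_fn)        # None
print(y_copy.requires_grad)  # False
```

From this it looks like `y` keeps its `grad_fn` after the `detach()` call, but I’m not sure whether that means it is still fully part of the graph.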
This is because I’d like to modify one of my variables which requires grad, perform an operation, and then use it again along with the original `tensor`. But when I do `.detach()` and modify the result, I get an error during `loss.backward()` saying that a tensor that required grad was modified by an in-place operation.
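Roughly, the failing pattern looks like this minimal sketch (the names `w`, `y`, and `loss` stand in for my actual variables):

```python
import torch

w = torch.randn(3, requires_grad=True)
y = w * w  # autograd saves w to compute the backward of this op

# Modify w in place through a detached view; detach() shares storage
# (and the version counter) with w, so this counts as modifying w.
w.detach().add_(1.0)

loss = y.sum()
loss.backward()  # RuntimeError: one of the variables needed for gradient
                 # computation has been modified by an inplace operation
```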
Thanks!