Does detach make a copy of the tensor while still maintaining the tensor in the graph?

Hi. I’d like to ask about tensor.detach().
I’ve read this post already, but I’d still like to ask. If I perform
tensor_copy = tensor.detach(),
this will create a copy of tensor detached from the graph and store the detached copy in tensor_copy. My question is: when performing this operation, does tensor get removed from the computation graph, or will it still remain? (I know that tensor_copy will be detached from the graph, but I’m not sure about tensor.)

This is because I’d like to modify one of my variables that requires grad, perform an operation on it, and then use it again along with the original tensor. But when I do .detach(), I get an error at loss.backward() saying that a tensor needed for gradient computation has been modified by an in-place operation.
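Roughly, a minimal sketch of what happens (the names and the sigmoid op are illustrative, not my actual code):

import torch

a = torch.randn(3, requires_grad=True)
b = a.sigmoid()      # sigmoid saves its output for the backward pass
b.detach().mul_(0)   # .detach() shares storage with b, so this in-place edit also changes b
b.sum().backward()   # RuntimeError: one of the variables needed for gradient
                     # computation has been modified by an inplace operation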

Thanks!

@albanD, could you please help? You’ve answered previous posts regarding this question.

Hi,

.detach() returns a new Tensor without history, but the two tensors share the same underlying content. To get an actual copy, you can add a .clone() after the detach().
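A quick sketch of the difference (illustrative only):

import torch

x = torch.ones(3, requires_grad=True)
d = x.detach()           # no history, but same underlying storage as x
d[0] = 5.0
print(x[0])              # the change is visible through x

c = x.detach().clone()   # independent copy with its own storage
c[1] = 7.0
print(x[1])              # still 1.0 -- x is unaffected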


Hi @albanD.
Thanks for your reply. My question is not about getting an actual copy. It is whether the original tensor (tensor in my example) will be removed from the graph.

tensor.detach() is an out-of-place operation, so it does not modify tensor itself. tensor keeps its place in the computation graph; only the returned tensor is detached.
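A quick check along those lines (a minimal sketch, not from the thread):

import torch

w = torch.randn(2, requires_grad=True)
t = w * 3                    # t is created by an op, so it is in the graph
t_copy = t.detach()

print(t.requires_grad)       # True  -- t is unchanged and still in the graph
print(t.grad_fn)             # <MulBackward0 ...> -- its history is intact
print(t_copy.requires_grad)  # False -- only the new tensor is detached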