After the tensor.detach()
operation, the returned tensor is a leaf node and gradients will not be backpropagated through it, even if the tensor was an intermediate output.
Is this because leaf nodes do not backpropagate, or is there another reason?
Thanks.
Hi,
No, it is because .detach()
breaks the link in the autograd graph between the original tensor and the detached one, so backprop cannot flow back past that point.
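A minimal sketch illustrating this (the variable names are just for illustration): the detached tensor is a leaf with no gradient history, so only the non-detached path contributes to the gradient.

```python
import torch

x = torch.ones(3, requires_grad=True)
y = x * 2          # intermediate tensor, part of the autograd graph
z = y.detach()     # z shares data with y but has no grad history

print(z.is_leaf)         # True: detached tensors are leaf nodes
print(z.requires_grad)   # False: no gradient tracking

# Gradients flow through y but not through z.
out = (y + z).sum()
out.backward()
print(x.grad)      # tensor([2., 2., 2.]) — only the y path contributes
```

Note that z still shares storage with y, so in-place modifications of z will also change y; detaching only severs the graph connection, not the data.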