Backpropagation and leaf nodes

After the tensor.detach() operation, the tensor becomes a leaf node and gradients are no longer backpropagated through it, even if the tensor was an intermediate output.
Is this because leaf nodes never backpropagate, or is there another reason?



No, it is not about leaf nodes in general: leaf tensors with requires_grad=True (like model parameters) do receive gradients. It is because .detach() breaks the link between the new tensor and the graph that produced it, so backprop cannot go back through that point. The detached tensor is a leaf with requires_grad=False and no grad_fn.
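A minimal sketch illustrating the distinction (variable names are illustrative): the detached tensor is a leaf with no grad_fn, so nothing flows back through it, while the original leaf with requires_grad=True still accumulates gradients.

```python
import torch

x = torch.ones(3, requires_grad=True)  # leaf node that tracks gradients
y = x * 2                              # intermediate (non-leaf) tensor, has a grad_fn
z = y.detach()                         # new leaf, cut off from the graph

print(z.is_leaf)        # True: detach() produces a leaf tensor
print(z.requires_grad)  # False: no gradient history is kept
print(z.grad_fn)        # None: the link back to y is broken

# Anything computed from z carries no graph, so calling .backward() on it
# would raise a RuntimeError ("does not require grad"):
loss_detached = (z * 3).sum()
print(loss_detached.requires_grad)  # False

# By contrast, the undetached path still backpropagates into the leaf x:
loss = (y * 3).sum()
loss.backward()
print(x.grad)  # d(sum(6*x))/dx = 6 for each element
```

So gradients stop at z not because it is a leaf, but because detach() removed its grad_fn; leaves created with requires_grad=True are exactly where gradients are accumulated.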