Question about torch.clone()

As I understand it, clone() keeps track of the original tensor's history of operations and passes the gradient back to the original tensor. So I'm confused about when I should use clone(), because I think the cloned tensor will have the same effect as the original one after the backward pass through the net (i.e. the gradients passed to the original tensor are equal).

While .clone() is differentiable, so the gradient will be forwarded to the original tensor, the clone has its own memory: e.g. in-place operations on the clone won't change the original tensor's values.
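A minimal sketch (not from the original post) illustrating both points, assuming a plain leaf tensor:

```python
import torch

x = torch.tensor([1.0, 2.0, 3.0], requires_grad=True)
y = x.clone()              # y has its own storage, but stays in x's autograd graph

# 1) clone() is differentiable: the gradient flows back to the original tensor
(y * 2).sum().backward()
print(x.grad)              # tensor([2., 2., 2.])

# 2) the clone has its own memory: in-place changes don't touch the original
with torch.no_grad():
    y.add_(100.0)
print(x)                   # still tensor([1., 2., 3.], requires_grad=True)
print(y)                   # tensor([101., 102., 103.], ...)
```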