I am really confused about when and why to use the clone() method. The official tutorial for creating a custom autograd function uses clone() in the backward pass. Why is that needed? What happens if we do not use clone(), given that the computation graph is not affected by the clone() method? Any help is highly appreciated.
You should use clone() to get a new Tensor with the same value but backed by new memory.
The tutorial uses it because it later modifies the Tensor inplace, and it is forbidden to modify the gradient given to you inplace. So it first clones the gradient to get new memory; the inplace change then won’t break that rule.
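To make this concrete, here is a minimal sketch of a custom ReLU in the style of the tutorial. The key line is `grad_output.clone()`: the subsequent inplace write happens on the clone, not on the gradient tensor autograd handed us.

```python
import torch

class MyReLU(torch.autograd.Function):
    @staticmethod
    def forward(ctx, input):
        ctx.save_for_backward(input)
        return input.clamp(min=0)

    @staticmethod
    def backward(ctx, grad_output):
        input, = ctx.saved_tensors
        # clone() allocates new memory, so the inplace write below
        # does not modify the grad_output tensor that autograd gave us.
        grad_input = grad_output.clone()
        grad_input[input < 0] = 0  # inplace modification is safe on the clone
        return grad_input

x = torch.tensor([-1.0, 2.0], requires_grad=True)
y = MyReLU.apply(x)
y.sum().backward()
print(x.grad)  # gradient is 0 where x < 0, 1 elsewhere
```

If you dropped the clone() and wrote `grad_output[input < 0] = 0` directly, you would be mutating a tensor that autograd may still need elsewhere, which can silently corrupt other gradients.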