Create a new tensor with same value?

Hi, for PyTorch < 0.4, we could easily do:

a = Variable(b.data, requires_grad=True)

However, since .data is not recommended in 0.4, we can instead do one of:

1. a = b.detach()
   a.requires_grad_()

2. a = b.detach().clone()
   a.requires_grad_()

3. a = b.clone().detach()
   a.requires_grad_()

I wonder what the difference is among 1, 2, and 3.


detach() will create a new tensor, so there is no need to clone() before or after detach(). They are all the same.

Yes, using only detach() is not right; the documentation mentions this.

That’s not entirely true, since the detached tensor shares its storage with the original one. If you want to modify them independently of each other, you have to clone. That said, approaches 2 and 3 are indeed equivalent, but in approach 1 a would still share the same storage as b, which is not the case in the other approaches thanks to the clone() operation.
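For example, something along these lines (a rough sketch; the tensor values and names are made up) shows the side effect of approach 1 versus the cloned variants:

import torch

b = torch.ones(3)

# Approach 1: detach() only -- a1 shares b's storage
a1 = b.detach()
a1[0] = 5.0
print(b)   # tensor([5., 1., 1.])  b was changed as a side effect

# Approaches 2/3: clone() gives a2 its own storage
b = torch.ones(3)
a2 = b.clone().detach()
a2[0] = 5.0
print(b)   # tensor([1., 1., 1.])  b is untouched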


Yes, using only detach() can lead to subtle bugs.

Hi,

If what you want is exactly the old a = Variable(b.data, requires_grad=True), then the equivalent is a = b.detach().requires_grad_(). In both cases, a and b share the same storage and won’t use extra memory. The new version is better, though, because the autograd engine will properly detect if you do in-place operations on a while b's original value was needed for something else. So the gradients will be computed properly, and if that can’t be done, it will raise an error.
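To illustrate (a hypothetical snippet, not from the original post; exp is used here because it saves its output for the backward pass), something like this should trigger that error:

import torch

x = torch.randn(3, requires_grad=True)
b = x.exp()                      # exp saves its output for the backward pass
a = b.detach().requires_grad_()  # a shares b's storage

with torch.no_grad():            # allow the in-place op on a leaf that requires grad
    a.zero_()                    # also overwrites b's values in place

b.sum().backward()               # should raise RuntimeError: a variable needed for
                                 # gradient computation has been modified by an
                                 # inplace operation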

If such an error occurs, then you will need to add a clone to make sure that you won’t change b by side effect. The best way to do that, I think, is a = b.detach().clone().requires_grad_().
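As a quick usage sketch (tensor names are just for illustration), the cloned version gives you a fully independent leaf that tracks its own gradients:

import torch

x = torch.randn(3, requires_grad=True)
b = x.exp()

# Independent copy: own storage, detached from b's graph, requires grad itself
a = b.detach().clone().requires_grad_()

(a * 2).sum().backward()
print(a.grad)   # tensor([2., 2., 2.])
print(x.grad)   # None -- nothing flows back to x through a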