We’ll provide a migration guide when 0.4.0 is officially released. Here are the answers to your questions:
- `tensor.detach()` creates a tensor that shares storage with `tensor` but does not require grad. `tensor.clone()` creates a copy of `tensor` that imitates the original `tensor`'s `requires_grad` field. You should use `detach()` when attempting to remove a tensor from a computation graph, and `clone()` as a way to copy the tensor while still keeping the copy as a part of the computation graph it came from.
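To make the distinction concrete, here is a minimal sketch using the 0.4.0-style API (variable names are illustrative):

```python
import torch

x = torch.ones(3, requires_grad=True)

d = x.detach()  # shares storage with x, does not require grad
c = x.clone()   # new storage, stays in the graph, inherits requires_grad

print(d.requires_grad)  # False
print(c.requires_grad)  # True

# detach() shares memory: mutating d also changes x's data.
d[0] = 5.0
print(x[0].item())  # 5.0
```

Note that `clone()` is differentiable, so gradients flowing through the copy will propagate back to the original.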
- `tensor.data` returns a new tensor that shares storage with `tensor`. However, it always has `requires_grad=False` (even if the original `tensor` had `requires_grad=True`).
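A small sketch of that behavior (0.4.0-style API; the variable names are illustrative):

```python
import torch

x = torch.ones(2, requires_grad=True)

d = x.data  # shares storage with x, but requires_grad is always False
print(d.requires_grad)  # False

# In-place edits made through .data are not tracked by autograd,
# so gradients computed afterwards can silently be wrong. This is
# one reason .data is discouraged in favor of detach().
```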
- You should try not to call `tensor.data` in 0.4.0. What are your use cases for `tensor.data`?
- `tensor.clone()` makes a copy of `tensor`. `variable.clone()` and `variable.detach()` in 0.3.1 act the same as `tensor.clone()` and `tensor.detach()` in 0.4.0.
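In particular, since `clone()` keeps the copy in the graph, gradients flow through it back to the source tensor. A minimal sketch:

```python
import torch

x = torch.tensor([2.0, 3.0], requires_grad=True)

y = x.clone()       # copy that remains part of x's computation graph
y.sum().backward()  # gradients propagate through the clone back to x

print(x.grad)  # tensor([1., 1.])
```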