We’ll provide a migration guide when 0.4.0 is officially released. Here are the answers to your questions:
- `tensor.detach()` creates a tensor that shares storage with `tensor` and does not require grad. `tensor.clone()` creates a copy of `tensor` that imitates the original `tensor`'s `requires_grad` field.
  You should use `detach()` when attempting to remove a tensor from a computation graph, and `clone()` as a way to copy the tensor while still keeping the copy as a part of the computation graph it came from.
- `tensor.data` returns a new tensor that shares storage with `tensor`. However, it always has `requires_grad=False` (even if the original `tensor` had `requires_grad=True`).
- You should try not to call `tensor.data` in 0.4.0. What are your use cases for `tensor.data`?
- `tensor.clone()` makes a copy of `tensor`. `variable.clone()` and `variable.detach()` in 0.3.1 act the same as `tensor.clone()` and `tensor.detach()` in 0.4.0.
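A minimal sketch of these differences (assuming PyTorch 0.4 or later is installed): `detach()` shares storage but leaves the graph, `clone()` copies storage but stays in the graph, and `.data` shares storage while always reporting `requires_grad=False`.

```python
import torch

# A leaf tensor that tracks gradients.
x = torch.ones(3, requires_grad=True)

# detach(): shares storage with x, but requires_grad is False
# and it is cut off from the computation graph.
d = x.detach()
assert d.requires_grad is False
assert d.data_ptr() == x.data_ptr()  # same underlying storage

# clone(): new storage, but the copy stays in the graph,
# so gradients flow back through it to x.
c = x.clone()
assert c.data_ptr() != x.data_ptr()
c.sum().backward()
assert torch.equal(x.grad, torch.ones(3))

# .data: shares storage and always has requires_grad=False,
# even though x itself requires grad. Writes through it are
# invisible to autograd, which is why it's best avoided.
assert x.data.requires_grad is False
assert x.data.data_ptr() == x.data_ptr()
```

Note that because `detach()` and `.data` share storage with the original, in-place writes through them silently change the original tensor's values; `clone()` is the safe choice when you need an independent copy.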