Can you elaborate on the side effects you know of?
- Break the computational graph
- Break the inplace correctness checks
- Allow untracked inplace changes (see the sketch after this list)
- Allow inplace metadata changes of a Tensor
- When using `nn.Parameter`, it aliases with the `.data` attribute of that class that refers to the underlying Tensor.
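For example, here is a minimal sketch (assuming a recent PyTorch; the variable names are illustrative) of the graph-breaking and untracked-inplace side effects:

```python
import torch

# Side effect: .data breaks the computational graph
x = torch.ones(3, requires_grad=True)
y = x.data * 2
print(y.requires_grad)  # False: no gradient will ever flow back to x

# Side effect: untracked inplace changes silently corrupt gradients
a = torch.ones(3, requires_grad=True)
b = a * 2            # b == 2
c = b ** 2           # the backward of ** saves b
b.data.mul_(10)      # inplace change that autograd does not see
c.sum().backward()   # no error, but the gradient uses the modified b (20)
print(a.grad)        # tensor([80., 80., 80.]) instead of tensor([8., 8., 8.])
```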
When is it a good idea to still use `.data`?
For 99.9% of users: never.
The only case where it is useful at the moment (until we provide a proper API for it) is to work around the inplace correctness checks when they are too strict and you know exactly what you're doing.
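As a minimal sketch of what "working around the checks" means (illustrative names, assuming a recent PyTorch): the same inplace change is caught when done through `.detach()`, but slips past the version counter when done through `.data`:

```python
import torch

x = torch.ones(3, requires_grad=True)
y = x.exp()          # the backward of exp() needs the saved output y

y.detach().mul_(2)   # tracked: bumps the shared version counter
try:
    y.sum().backward()
except RuntimeError:
    print("inplace correctness check fired")  # .detach() is caught

x = torch.ones(3, requires_grad=True)
y = x.exp()
y.data.mul_(2)       # untracked: slips past the version counter
y.sum().backward()   # runs, but the gradient uses the modified y;
                     # only safe when you know the change is harmless
```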
Seems to me like `tensor.data.clone()` is slightly more efficient than `tensor.detach().clone()`.
What do you mean by more efficient?
If you mean faster, then not really. Both do a shallow copy of the Tensor; `.detach()` might be imperceptibly faster as it does not need to recreate the inplace correctness tracking metadata.
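To make the shallow-copy point concrete, a small illustrative sketch:

```python
import torch

t = torch.ones(3, requires_grad=True)

# Both are shallow copies: new Tensor objects over the same storage
print(t.data.data_ptr() == t.data_ptr())      # True
print(t.detach().data_ptr() == t.data_ptr())  # True

# The actual copy happens in .clone(), identically for both spellings
print(t.data.clone().data_ptr() == t.data_ptr())      # False
print(t.detach().clone().data_ptr() == t.data_ptr())  # False
```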
For sending just the content to a new device, `a = tensor.detach().to(device)` will do the trick.
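A hedged usage sketch (the `device` choice here is illustrative):

```python
import torch

tensor = torch.randn(3, requires_grad=True)
device = "cuda" if torch.cuda.is_available() else "cpu"

a = tensor.detach().to(device)
print(a.requires_grad)  # False: detached from the graph
print(a.device)         # the requested device

# Caveat: if tensor is already on `device`, .to() returns it unchanged,
# so `a` would still share memory with `tensor`.
```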