What’s the appropriate way to create a copy of a tensor, where the copy requires grad when the original tensor did not in 0.4?
Previously, I was using something like
Variable(original_tensor, requires_grad=True). However, this was in 0.3, where
original_tensor was a plain tensor (not a Variable).
Variable() seems to be on the way out, and I’d like to replace it with the appropriate approach in the new version. Should I be using some sort of
clone? Or will this result in the original also being updated to require grad as well? Or would something like
torch.tensor(original_tensor, requires_grad=True) (note, using the new
Tensor) create a new tensor based on the old one, which is its own tensor and will not change the grad requirements of the original? Or would a different approach be recommended? Thank you!