Creating a copy of a tensor where the copy requires_grad in 0.4?

What’s the appropriate way, in 0.4, to create a copy of a tensor where the copy requires grad but the original tensor did not?

Previously, I was using something like Variable(original_tensor, requires_grad=True). However, that was in 0.3, where original_tensor was only a tensor (and not a Variable). Variable() seems to be on the way out, and I’d like to replace it with the appropriate approach in the new version. Should I be using some sort of clone? Or will that result in the original also being updated to require grad? Or would something like torch.tensor(original_tensor, requires_grad=True) (note: the new tensor, not Tensor) create a new tensor based on the old one, but which is its own tensor and will not change the grad requirements of the original? Or would a different approach be recommended? Thank you!

It appears torch.tensor(original_tensor, requires_grad=True) works and does not change the original tensor. So it would seem to be the best solution to me, unless someone points out something I’m missing or a better way.
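A quick sanity check along those lines (a minimal sketch, assuming 0.4-style semantics where torch.tensor() copies its input) shows the copy gets its own storage and the original’s requires_grad flag is untouched:

import torch

orig = torch.randn(3, 3)                       # requires_grad defaults to False
copy = torch.tensor(orig, requires_grad=True)  # copies the data into a new tensor

print(orig.requires_grad)                      # False -- original is unchanged
print(copy.requires_grad)                      # True
print(orig.data_ptr() == copy.data_ptr())      # False -- separate underlying storage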


torch.tensor(original_tensor, requires_grad=True) will create a copy of the data as well as the tensor object.

If you just want an alias that shares the same data but has requires_grad=True, you can use:

orig = torch.randn(3, 3)            # does not require grad
x = orig.detach().requires_grad_()  # shares orig's data, but x tracks gradients
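To make the difference concrete, here is a short sketch (same assumptions as above): the detached alias shares orig’s storage, while gradients accumulate only on the alias:

import torch

orig = torch.randn(3, 3)
x = orig.detach().requires_grad_()      # alias: same storage, now tracks grad

print(orig.data_ptr() == x.data_ptr())  # True -- the data is shared, not copied
x.sum().backward()
print(x.grad)                           # ones(3, 3) -- gradient lives on x
print(orig.requires_grad)               # False -- original still doesn't track grad
print(orig.grad)                        # None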

Thank you. It’s definitely worth noting that there’s the option to share the data through a separate alias; I hadn’t thought of that, but it’s actually what I want.