[resolved] How does one create a copy of a tensor in PyTorch?

I was trying to create new copies (so that Python points at different objects rather than updating the same object across multiple namespaces). I tried getting the actual numpy data out of a tensor and using it to create a brand-new tensor, but that didn't work. Why did it not work? How does one do this?

>>> y

 3  3
 3  3
 3  3
[torch.FloatTensor of size 3x2]

>>> yv.data = torch.FloatTensor( y.numpy() )
>>> yv
Variable containing:
 3  3
 3  3
 3  3
[torch.FloatTensor of size 3x2]

>>> y

 3  3
 3  3
 3  3
[torch.FloatTensor of size 3x2]

>>> yv.data.fill_(5)

 5  5
 5  5
 5  5
[torch.FloatTensor of size 3x2]

>>> yv
Variable containing:
 5  5
 5  5
 5  5
[torch.FloatTensor of size 3x2]

>>> y

 5  5
 5  5
 5  5
[torch.FloatTensor of size 3x2]
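Why this didn't work: as the session above shows, constructing a FloatTensor from a numpy array shares the underlying memory with that array rather than copying it, and y.numpy() likewise shares memory with y, so the round trip never allocates new storage. A minimal sketch of the same pitfall in current PyTorch:

import torch

y = torch.full((3, 2), 3.0)

# torch.from_numpy() and Tensor.numpy() both share memory,
# so this round trip yields a view of y, not a copy:
shared = torch.from_numpy(y.numpy())
shared.fill_(5.)
print(y[0, 0])   # tensor(5.) -- y changed too

# torch.tensor(), by contrast, always copies its input:
copied = torch.tensor(y.numpy())
copied.fill_(7.)
print(y[0, 0])   # tensor(5.) -- y is unchanged this time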

clone maybe?

need to check it out…


this seems to be for Variables, but I wanted just a plain Tensor…


OK, that works:

>>> yv.data = y.clone()
>>> y

 5  5
 5  5
 5  5
[torch.FloatTensor of size 3x2]

>>> yv
Variable containing:
 5  5
 5  5
 5  5
[torch.FloatTensor of size 3x2]

>>> yv.data.fill_(10)

 10  10
 10  10
 10  10
[torch.FloatTensor of size 3x2]

>>> y

 5  5
 5  5
 5  5
[torch.FloatTensor of size 3x2]

>>> yv
Variable containing:
 10  10
 10  10
 10  10
[torch.FloatTensor of size 3x2]
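In current PyTorch (0.4+), Variable and Tensor are merged, so the same thing works directly on plain tensors — a minimal sketch:

import torch

y = torch.full((3, 2), 5.0)
yv = y.clone()     # clone() allocates new storage, so yv is independent of y
yv.fill_(10.)
print(y[0, 0])     # tensor(5.)  -- original unchanged
print(yv[0, 0])    # tensor(10.)

One caveat: clone() is recorded in the autograd graph, so gradients still flow back to the source tensor; see the detach() note below.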

Note that indexing with a (non-zero-dimensional) tensor also results in a copy.

(For zero-dim tensors, see this merged commit.)
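For example, a quick sketch of the difference (advanced indexing with an index tensor copies, while basic slicing returns a view):

import torch

t = torch.arange(6.).reshape(3, 2)

picked = t[torch.tensor([0, 2])]   # indexing with a tensor -> copy
picked.fill_(-1.)
print(t)                           # t is unchanged

sliced = t[0:2]                    # basic slicing -> view of the same storage
sliced.fill_(-1.)
print(t)                           # first two rows of t are now -1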

You should also detach the tensor from the computational graph if it requires grad; otherwise, gradients will be tracked for both tensors, which may lead to OOM issues.

import torch

x = torch.tensor([1., 2., 3.], requires_grad=True)
y = x.detach().clone()   # detach() drops the graph connection, clone() copies the data
y.requires_grad = True   # y now tracks gradients independently of x
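Equivalently, since the in-place requires_grad_() method returns the tensor it modifies, the copy can be made in one line:

y = x.detach().clone().requires_grad_(True)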