Does wrapping data in Variables or FloatTensors create copies?

If I have some data x (a NumPy array) and I wrap it in a PyTorch structure, does it duplicate the data? As in:

x = torch.FloatTensor(x)             # one copy?
xv = Variable(torch.FloatTensor(x))  # second copy?!

Do we now have 3 copies? One in NumPy and two in PyTorch?

Hi,

Going from a tensor to a NumPy array and back (with tensor.numpy() and torch.from_numpy()) does not create any copy; the tensor and the array share the same underlying memory.
Wrapping a tensor in a Variable does not create a copy either; the Variable holds the same storage.
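You can verify the sharing yourself by mutating one object and reading the other. A minimal sketch, assuming a float32 NumPy array: torch.from_numpy() is the guaranteed no-copy path, whereas the torch.FloatTensor(x) constructor may have to copy when it converts the dtype (e.g. float64 input).

import numpy as np
import torch
from torch.autograd import Variable

x = np.zeros(3, dtype=np.float32)

t = torch.from_numpy(x)  # shares memory with x, no copy
x[0] = 5.0
print(t[0])              # 5.0 -- the tensor sees the change to x

v = Variable(t)          # wraps t, still no copy
t[1] = 7.0
print(v.data[1])         # 7.0 -- the Variable sees the change to t

If you do need an independent copy, tensor.clone() (or np.array(x) on the NumPy side) allocates fresh memory.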
