Pytorch: why a.storage() is b.storage() return False when a and b reference the same data?

>>> a = torch.arange(12).reshape(2, 6)
>>> a
tensor([[ 0,  1,  2,  3,  4,  5],
        [ 6,  7,  8,  9, 10, 11]])
>>> b = a[1:3, :]
>>> b.storage() is a.storage()
False
# but
>>> b[0, 0] = 999
>>> b, a # both tensors are changed
(tensor([[999,   7,   8,   9,  10,  11]]),
 tensor([[  0,   1,   2,   3,   4,   5],
         [999,   7,   8,   9,  10,  11]]))

What exactly is the object that stores tensor data? How can I check if two tensors share memory?

Hi,

The storage object is not really a thing in Python. We just provide it as a thin wrapper so that you can access storage attributes from Python.
What happens is that every time you ask for it, you get a new Python object, so the is comparison is always going to be False:

s = a.storage()
print(s is a.storage()) # Prints False: each call returns a new wrapper

You can compare the .data_ptr() values of the storages to check this though. The two will be equal if the tensors share the same underlying memory.
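As a minimal sketch of that check (assuming a reasonably recent PyTorch), the storages' base pointers match even though each `.storage()` call returns a distinct wrapper object:

```python
import torch

a = torch.arange(12).reshape(2, 6)
b = a[1:, :]  # a view into a's buffer

# Each .storage() call builds a fresh Python wrapper, so identity fails:
print(a.storage() is a.storage())                        # False
# ...but both wrappers point at the same underlying buffer:
print(a.storage().data_ptr() == b.storage().data_ptr())  # True
```

Note that the storage's data_ptr() is the base address of the whole buffer, so it matches for any two views of the same storage, even views that start at different offsets (the tensor-level b.data_ptr() differs from a.data_ptr() here because b starts at row 1).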
