I used the function id() to check the reference of an element of a torch tensor and found it was changing all the time. But if I check the id() of an element in a list, it is stable. I just wonder why this is.
The code is below:
import torch
a = torch.tensor([0, 1, 2, 3])
b = [0, 1, 2, 3]
You’re querying Python objects, and in Python an object is created for basically everything. List indexing is a bit special: it is a built-in operation that returns the element objects already stored in the list, without creating or copying anything. Tensors, on the other hand, are wrapper objects, and indexing them goes through code that builds a new wrapper each time.
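A small pure-Python illustration of that difference (this relies on CPython's usual object semantics; the Wrapper class below is a hypothetical stand-in, not a PyTorch type):

```python
# List indexing hands back the element object that already lives in the
# list, so the same object (and the same id) comes back on every lookup.
b = [0, 1, 2, 3]
print(b[0] is b[0])   # True: both lookups return the existing element

# By contrast, an operation that constructs a fresh object per call
# yields a distinct object each time.
class Wrapper:
    """Hypothetical stand-in for a per-element wrapper object."""
    def __init__(self, value):
        self.value = value

w1 = Wrapper(0)
w2 = Wrapper(0)
print(w1 is w2)       # False: two separate objects were created
```

Because both `w1` and `w2` are alive at the same time, their id() values are guaranteed to differ; that is the situation tensor indexing puts you in.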
The stable thing is the memory address of the tensor’s storage (technically a number exported from the C++ side): a.storage().data_ptr()
Thanks a lot for your answer! The function storage().data_ptr() helps me a lot. But I am still confused about the function id(). If it does not return the memory address, does it return a number for referencing the corresponding object? Then why is it unstable for elements of the tensor a? Is the tensor a constantly changing its references to its elements?
You’re using id() on wrapper objects that are created on the fly. E.g. when you do a[0] you get a new object that shares storage with a, but also records a storage offset. Each repeated call to a[0] creates a new wrapper, as it is impossible to cache these.
I understand, thank you so much!