id() of a tensor element changes all the time

I used the function id() to check the reference of an element of a torch tensor and found that it changes all the time. But if I check the id() of an element of a list, it is stable. I just wonder why this is?

The code is below:

import torch
   ...: a = torch.tensor([0, 1, 2, 3])
   ...: b = [0, 1, 2, 3]

In [16]: id(a[0])
Out[16]: 140023032068568

In [17]: id(a[0])
Out[17]: 140023032095656

In [18]: id(a[0])
Out[18]: 140023032069288

In [19]: id(b[0])
Out[19]: 94672250583712

In [20]: id(b[0])
Out[20]: 94672250583712

In [21]: id(b[0])
Out[21]: 94672250583712

Out[22]: 140024314842496
Out[23]: 140024314842496
Out[24]: 140024314842496

You’re querying Python objects, and these are created on the fly for basically everything. List indexing is a built-in operation that returns the stored object itself, with no object creation or copying, so it is a bit special. Tensors are wrappers, and indexing them runs Python code in __getitem__ that produces a new wrapper object each time.
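A quick way to see the difference, as a sketch using the same `a` and `b` as in the question:

```python
import torch

a = torch.tensor([0, 1, 2, 3])
b = [0, 1, 2, 3]

# A list hands back the very object it stores, so identity is stable:
print(b[0] is b[0])  # True

# Tensor indexing goes through __getitem__ and builds a new wrapper
# object on every call, so identity differs:
print(a[0] is a[0])  # False
```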

The stable thing is the memory address of the tensor’s storage (technically that’s a number exported from the C++ part):
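For example, repeated calls return the same address, since all views of the tensor share one underlying buffer:

```python
import torch

a = torch.tensor([0, 1, 2, 3])

# The address of the underlying C++ buffer does not change
# between calls, unlike id() of the Python wrapper objects:
print(a.storage().data_ptr())
print(a.storage().data_ptr())  # same number both times
```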

Thanks a lot for your answer! The function storage().data_ptr() helps me a lot. But I am still confused about the function id(). If it does not return the memory address, does it return a number identifying the corresponding object? Then why is it unstable for elements of the tensor a? Is the tensor a constantly changing its references to its elements?

You’re using id() on wrapper objects that are created on the fly. E.g. when you do a[100000] you get a new tensor object that shares storage with “a” but also records a storage offset. Repeating the call a[100000] creates a new wrapper each time, as it is impossible to cache these.
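A small sketch of that, using a smaller index into the same tensor from the question:

```python
import torch

a = torch.tensor([0, 1, 2, 3])

x = a[2]
y = a[2]

# Each indexing call builds a fresh wrapper object:
print(x is y)  # False

# The wrapper records where its element sits inside the shared storage:
print(x.storage_offset())  # 2

# Both wrappers point into the same underlying buffer:
print(x.data_ptr() == y.data_ptr())  # True
```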

I understand. Thank you so much!