Duplicated result when getting a cloned Tensor's data_ptr in a list comprehension

Python 3.5, PyTorch 1.3

In [1]: import torch                                                                                                                                                                         

In [2]: x = torch.randn(1, 2, 3).to("cuda:0")                                                                                                                                                

In [3]: y = torch.randn(1, 2, 4).to("cuda:0")                                                                                                                                                

In [4]: bindings = [x.clone().data_ptr(), y.clone().data_ptr()]                                                                                                                              

In [5]: bindings                                                                                                                                                                             
Out[5]: [140582636749824, 140582636749824]

In [6]: y.clone().data_ptr()                                                                                                                                                                 
Out[6]: 140582636749824

Both data_ptrs in bindings are the same as y.clone().data_ptr().

Furthermore, clone() only starts returning a new data_ptr after the cloned object is assigned to a variable.

How does that happen?

In [1]: import torch                                                                                                                                                                         

In [2]: x = torch.randn(1, 2, 3).to("cuda:0")                                                                                                                                                
# ========> x.clone().data_ptr() does not change
In [3]: x.clone().data_ptr()                                                                                                                                                                 
Out[3]: 140563644940800

In [4]: x.clone().data_ptr()                                                                                                                                                                 
Out[4]: 140563644940800

# ========> but after assigning the cloned object to a variable, x.clone().data_ptr() changes
In [5]: a = x.clone()                                                                                                                                                                        

In [6]: a.data_ptr(), x.clone().data_ptr()                                                                                                                                                   
Out[6]: (140563644940800, 140563644941312)

# ========> each assignment changes the data_ptr again
In [7]: x.clone().data_ptr()                                                                                                                                                                 
Out[7]: 140563644941312

In [8]: b = x.clone()                                                                                                                                                                        

In [9]: b.data_ptr(), x.clone().data_ptr()                                                                                                                                                   
Out[9]: (140563644941312, 140563644941824)


This happens because of the caching allocator.
When you do x.clone().data_ptr(), you first allocate new memory for the clone, then extract the pointer, and then discard everything but the value of that pointer. Because CPython is reference-counted, the cloned tensor (whose refcount drops to zero as soon as the expression finishes) is immediately deleted, and its memory is returned to the allocator.
When you do y.clone().data_ptr() just after, that same memory block is free again, so the caching allocator reuses it to store the clone of y. Here again, the cloned tensor is deleted right away and only the value of the data pointer is kept.
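The practical takeaway: if you need the raw pointers to stay valid and distinct, keep the cloned tensors alive for as long as the pointers are in use. A minimal sketch (using CPU tensors for portability; the same reasoning applies to the CUDA caching allocator):

```python
import torch

x = torch.randn(1, 2, 3)
y = torch.randn(1, 2, 4)

# Bug: the temporary clones are freed as soon as each expression finishes,
# so the allocator may hand the same block to the next clone and both
# pointers can come out identical.
maybe_duplicated = [x.clone().data_ptr(), y.clone().data_ptr()]

# Fix: hold a reference to each clone so its storage stays allocated.
clones = [x.clone(), y.clone()]
bindings = [t.data_ptr() for t in clones]

# Two simultaneously live tensors never share the same storage.
assert bindings[0] != bindings[1]
```

The list of live clones can be kept alongside the list of pointers (e.g. as an attribute on the same object) and dropped together once the pointers are no longer needed.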
