Environment: Python 3.5, torch 1.3
In [1]: import torch
In [2]: x = torch.randn(1, 2, 3).to("cuda:0")
In [3]: y = torch.randn(1, 2, 4).to("cuda:0")
In [4]: bindings = [x.clone().data_ptr(), y.clone().data_ptr()]
In [5]: bindings
Out[5]: [140582636749824, 140582636749824]
In [6]: y.clone().data_ptr()
Out[6]: 140582636749824
Both data_ptrs in `bindings` are identical, and they also match `y.clone().data_ptr()`.
Furthermore, `clone()` only seems to allocate a new data_ptr after the cloned tensor has been assigned to a variable.
How does that happen? A fuller repro:
In [1]: import torch
In [2]: x = torch.randn(1, 2, 3).to("cuda:0")
# ========> x.clone().data_ptr() does not change across calls
In [3]: x.clone().data_ptr()
Out[3]: 140563644940800
In [4]: x.clone().data_ptr()
Out[4]: 140563644940800
# ========> but after assigning the cloned tensor to a variable, x.clone().data_ptr() changes
In [5]: a = x.clone()
In [6]: a.data_ptr(), x.clone().data_ptr()
Out[6]: (140563644940800, 140563644941312)
# ========> every such assignment shifts the data_ptr again
In [7]: x.clone().data_ptr()
Out[7]: 140563644941312
In [8]: b = x.clone()
In [9]: b.data_ptr(), x.clone().data_ptr()
Out[9]: (140563644941312, 140563644941824)
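My guess at the mechanism: each bare `x.clone()` is a temporary whose refcount drops to zero right after `data_ptr()` returns, so the allocator is free to hand the just-freed block back to the next clone; binding the clone to a variable keeps it alive, forcing the next allocation to a fresh address. The same pattern can be seen with plain CPython objects and `id()`, with no torch needed (the `Blob` class below is just a hypothetical stand-in for a tensor):

```python
class Blob:
    """Hypothetical stand-in for a tensor: any heap object works."""
    pass

# Temporaries: the first Blob is freed (refcount hits zero) before the
# second is created, so the allocator *may* reuse the same address.
addr1 = id(Blob())
addr2 = id(Blob())
# addr1 == addr2 is likely due to allocator reuse, but not guaranteed.

# Keeping references forces distinct allocations: two objects that are
# alive at the same time can never share an address.
a, b = Blob(), Blob()
assert id(a) != id(b)
```

If this is what happens with the CUDA caching allocator too, then building `bindings` from bare `x.clone().data_ptr()` calls stores pointers into memory that has already been released, which would explain the duplicate addresses.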