Hi,
Does the .contiguous() function create a copy of the memory corresponding to the tensor's data?
This was mentioned by @ptrblck here: Contigious vs non-contigious tensor, but I would like to know if anyone is sure about this.
Thank you in advance for your help.
Samuel
If you call contiguous() on a non-contiguous tensor, a copy will be performed. Otherwise it will be a no-op. You could add some print statements to the linked example code, and you will see an increased memory usage after the y.contiguous() call.
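As a minimal sketch (not part of the linked example), one way to check the copy/no-op behavior is via is_contiguous() and data_ptr(), which reports the address of a tensor's underlying memory:

import torch

x = torch.arange(12).view(4, 3)      # contiguous
y = x.t()                            # transposed view: non-contiguous, shares x's memory
print(y.is_contiguous())             # False
print(x.data_ptr() == y.data_ptr())  # True: y is only a view of x

z = y.contiguous()                   # copy, because y is non-contiguous
print(z.data_ptr() == y.data_ptr())  # False: z owns freshly allocated memory

w = x.contiguous()                   # no-op, because x is already contiguous
print(w.data_ptr() == x.data_ptr())  # True: same memory, no copy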
111284 (Peng):
I checked the ids of x and y in your example; although they have different ids, when I compare them the result shows they're the same.
import torch

x = torch.arange(12).view(4, 3)
y = x.t()
y = y.contiguous()
print(id(x.storage()))
print(id(y.storage()))
print(id(x.storage()) == id(y.storage()))
> 4495471152
> 4495376464
> True
Then I further checked each element of x and y manually, and I found that the id of the same element in x and y is identical. For example:
id(x.storage()[0])
> 4336047184
id(y.storage()[0])
> 4336047184
id(x.storage()[3]) # x.storage()[3] == 3
> 4336047280
id(y.storage()[1]) # y.storage()[1] == 3
> 4336047280
It seems to me that contiguous() doesn't create a copy of the memory, but creates a new “reference” to the same memory. Since I don't know much about memory management, I would like to know whether my guess is to some extent right.
Thank you in advance.
I think .storage() will return a new Python wrapper object each time it is called, so its id() compares the wrappers, not the underlying data, and cannot be used to compare the location of the data:
import torch

x = torch.randn(1)
print(id(x.storage()))
> 3001145106760
print(id(x.storage()))
> 3001133757320
print(id(x.storage()))
> 3001171709448
print(id(x.storage()))
> 3001576998536
You could try to compare the .data_ptr() instead.
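Applied to the example above, a data_ptr() comparison shows the copy directly (a small sketch; the actual pointer values will differ between runs):

import torch

x = torch.arange(12).view(4, 3)
y = x.t()
print(x.data_ptr() == y.data_ptr())  # True: the transpose is a view into x's memory
y = y.contiguous()
print(x.data_ptr() == y.data_ptr())  # False: contiguous() copied the data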
111284 (Peng):
Thank you for your answer!