Contiguous() and permute()

I am trying to understand how contiguous() and permute() work in PyTorch, but I can't quite get it.
Can anyone please provide an explanation and an example to make it clear for me?
Thanks


You can use stride() and size() to track the memory layout:

import torch

a = torch.randn(3, 4, 5)
b = a.permute(1, 2, 0)
c = b.contiguous()
d = a.contiguous()
# a has "standard layout" (called C order in numpy): descending strides, and no memory gaps (stride(i-1) == size(i) * stride(i))
print(a.shape, a.stride(), a.data_ptr())
# b shares a's storage (same data_ptr), but has its strides and sizes swapped around
print(b.shape, b.stride(), b.data_ptr())
# c lives in new storage, where it has been rearranged into standard layout (which is what "contiguous" means)
print(c.shape, c.stride(), c.data_ptr())
# d is exactly a, since a was contiguous all along
print(d.shape, d.stride(), d.data_ptr())
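
As a small follow-up sketch (using the same tensors as above) of why this matters in practice: permute() only reindexes the same storage, and view() requires a compatible memory layout, while reshape() will copy when it has to:

import torch

a = torch.randn(3, 4, 5)   # standard layout: strides (4*5, 5, 1) == (20, 5, 1)
b = a.permute(1, 2, 0)     # shape (4, 5, 3), strides (5, 1, 20) -- same storage as a
c = b.contiguous()         # shape (4, 5, 3), strides (15, 3, 1) -- fresh storage

# permute() only reindexes: b[i, j, k] reads the same element as a[k, i, j]
assert b[1, 2, 0] == a[0, 1, 2]

# view() needs a compatible (contiguous) layout, so it fails on b ...
try:
    b.view(-1)
except RuntimeError as e:
    print("view failed:", e)

# ... but works on c; reshape() makes a contiguous copy behind the scenes when needed
c.view(-1)
b.reshape(-1)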

Best regards

Thomas
