Is permute() lazy?

I am using permute() and reshape() to pass data to a library/module which by default operates on the last two dimensions. For performance reasons, I am wondering: will this involve actual data movement?

Or does PyTorch instead just keep track of how to index into the data and change that?

I believe permute is lazy in that sense, as it returns a view rather than a copy of the tensor. However, this also means that the result isn't guaranteed to be contiguous:

>>> import torch
>>> a = torch.randn(2, 3, 224, 224)
>>> a.data_ptr()
140712810848320
>>> a.is_contiguous()
True
>>> a = a.permute(0, 2, 3, 1)
>>> a.data_ptr()
140712810848320
>>> a.is_contiguous()
False
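To expand on that: permute only rewrites the tensor's strides, which is exactly the "keep track of how to index into the data" behaviour asked about. The cost can show up later, though: a reshape() or .contiguous() call on the permuted tensor has to copy the data whenever the requested layout can't be expressed with strides alone. Continuing the session above (the data_ptr values are machine-specific):

>>> a.shape
torch.Size([2, 224, 224, 3])
>>> a.stride()  # strides were reordered; the storage itself is untouched
(150528, 224, 1, 50176)
>>> b = a.contiguous()  # a is non-contiguous, so this materializes a copy
>>> b.data_ptr() == a.data_ptr()
False
>>> c = a.reshape(2, -1)  # no view with these strides is possible, so this copies too
>>> c.data_ptr() == a.data_ptr()
False

So whether you actually pay for data movement depends on what happens after the permute: if the downstream library needs a contiguous tensor (or you reshape across the permuted dimensions), the copy happens at that point.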
