If you check `data.stride()` you'll see that the strides are different before and after you call `contiguous()`. It's all about how the tensor is laid out in memory; you can find more details in the NumPy docs on strides.
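A minimal sketch of what that means (variable names here are just illustrative): a transpose changes the strides without moving any data, and `contiguous()` copies the values into a fresh row-major layout.

```python
import torch

x = torch.arange(12).reshape(3, 4)
y = x.t()  # transpose: same storage, strides swapped

print(x.stride())         # (4, 1) -- row-major layout
print(y.stride())         # (1, 4) -- strides swapped, data untouched
print(y.is_contiguous())  # False

z = y.contiguous()        # copies into a fresh row-major buffer
print(z.stride())         # (3, 1) -- contiguous for the 4x3 shape
print(torch.equal(y, z))  # True -- same values, different memory layout
```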
Is the source code for `nn.Contiguous` different from `tensor.contiguous()`?
I tried using `tensor.stride()`, but I don't see how it helps show the difference. @apaszke, could you write a simple code example demonstrating the use of `contiguous()`?
I found an example of `contiguous` on this Torch page, but I can't translate it to PyTorch. Would that help you put together an example in PyTorch?