When I experiment with the language model example in PyTorch, the following use of contiguous seems to make no difference when I remove it:
data = data.view(bsz, -1).t().contiguous()
I checked the docs on contiguous here, but the limited information there still makes me question whether it is actually needed at this point.
I tried to find tensor.contiguous on GitHub, but couldn't locate it.
Could anyone point me to the source code of torch.Tensor.contiguous? Thanks a lot!
If you check data.stride() you'll see that the strides are different after you call contiguous(). It's all about how tensors are laid out in memory; you can find more details in the NumPy docs.
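For example, a minimal sketch of what stride() reports before and after contiguous() (tensor sizes chosen arbitrarily for illustration):
import torch

x = torch.arange(12).view(3, 4)  # contiguous, shape (3, 4), strides (4, 1)
y = x.t()                        # transposed view, shape (4, 3), strides (1, 4) -- no copy yet
print(y.stride(), y.is_contiguous())   # (1, 4) False
z = y.contiguous()               # copies the data into a fresh row-major layout
print(z.stride(), z.is_contiguous())   # (3, 1) True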
Thanks @Veril and @apaszke for your replies!
The source code of nn.Contiguous is different from torch.Tensor.contiguous, as far as I can tell.
I tried to use tensor.stride, but I don't see how that helps show the difference. @apaszke, could you write a short code snippet to demonstrate the use of contiguous()?
I found an example of contiguous on this Torch page, but I can't translate it to PyTorch. Would this help you put together an example in PyTorch?
Thanks a lot!
nn.Contiguous is part of the legacy package and unless you’re coming from Lua Torch you probably should never touch that code.
x = torch.randn(5, 4)
x.view(4, 5) # ok
x.t().view(4, 5) # fails
Thank you so much for your patience and help, @apaszke and @Veril!
x.t().view(4, 5) should work, since x.t() already has size (4, 5).
However, x.t().view(5, 4) should give the error about the tensor not being contiguous.
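A quick way to check all three cases (a rough sketch; the exact error message may vary across PyTorch versions):
import torch

x = torch.randn(5, 4)
y = x.t()                      # shape (4, 5), shares storage with x, not contiguous
y.view(4, 5)                   # ok: the requested shape matches the current layout, nothing to reorder
# y.view(5, 4)                 # RuntimeError: view size is not compatible with input tensor's size and stride
y.contiguous().view(5, 4)      # ok: contiguous() first copies the data into row-major order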
Thanks for the clarification.
I guess there is a minor typo in the second print: