Where is the source code for `torch.Tensor.contiguous()`?


(dl4daniel) #1

When I experiment with the PyTorch language model example, the following use of contiguous() seems to make no difference — the code appears to run fine without it.

data = data.view(bsz, -1).t().contiguous()

I checked the docs on contiguous here, but the limited information still leaves me questioning whether it is necessary in this case.

I tried to find tensor.contiguous on GitHub, but couldn’t locate it.

Could anyone point me to the source code of torch.Tensor.contiguous? Thanks a lot!


Call contiguous() after every permute() call?
(Adam Paszke) #3

If you check data.stride() you’ll see that the strides differ after you call contiguous(). It’s all about how tensors are laid out in memory; you can find more details in the numpy docs.
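For example, here is a minimal sketch (shapes chosen arbitrarily) showing that a transpose only swaps strides over the same storage, while contiguous() makes a fresh row-major copy:

```python
import torch

x = torch.arange(6).view(2, 3)  # row-major layout: strides (3, 1)
y = x.t()                       # a view with swapped strides; no data is copied
z = y.contiguous()              # a fresh row-major copy of y's elements

print(y.stride())                    # (1, 3)
print(z.stride())                    # (2, 1)
print(y.data_ptr() == x.data_ptr())  # True: y shares x's storage
print(z.data_ptr() == x.data_ptr())  # False: contiguous() allocated new storage
```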


(dl4daniel) #4

Thanks @Veril @apaszke for your replies!

The source code for nn.Contiguous is different from tensor.contiguous, right?

I tried tensor.stride(), but I don’t see how it shows the difference. @apaszke, could you write a short code example demonstrating the use of contiguous?

I found an example of contiguous on this Torch page, but I can’t translate it to PyTorch. Would that help as a starting point for a PyTorch example?

Thanks a lot!


(Adam Paszke) #6

nn.Contiguous is part of the legacy package and unless you’re coming from Lua Torch you probably should never touch that code.

x = torch.randn(5, 4)
print(x.stride(), x.is_contiguous())          # (4, 1) True
print(x.t().stride(), x.t().is_contiguous())  # (1, 4) False
x.view(20)      # ok: x is contiguous
x.t().view(20)  # fails: x.t() is not contiguous
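To spell out what contiguous() buys you in this situation, here is a minimal sketch (the shapes are arbitrary): flattening a transposed tensor with view() fails because no single set of strides can walk the transposed elements in order, but a contiguous copy can be viewed freely.

```python
import torch

x = torch.randn(5, 4)
t = x.t()  # transposed view: shape (4, 5), strides (1, 4), same storage as x

# view() cannot reinterpret the transposed layout as a flat tensor:
try:
    t.view(20)
except RuntimeError:
    print("view failed on non-contiguous tensor")

# contiguous() copies the data into row-major order, after which view() works:
flat = t.contiguous().view(20)
print(flat.shape)  # torch.Size([20])
```

Newer PyTorch also offers reshape(), which returns a view when possible and copies only when it has to.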

(dl4daniel) #7

Thank you so much for your patience and help @apaszke and @Veril