Does a contiguous tensor affect the training result?

Hi, I have read a lot of threads regarding contiguous tensors. I know some say that contiguity affects the speed of the forward pass, but I am curious: does it affect the network's training results, e.g. accuracy, loss, etc.?

Also, it seems that when using torch.reshape, there is no need for the tensor to be contiguous, right?

If an operation expects a contiguous tensor, it will make the memory contiguous explicitly, since otherwise the indexing might be wrong, which would influence the training in a bad way. :wink:
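A minimal sketch of what "non-contiguous" means here, assuming a tensor made non-contiguous via a transpose; calling .contiguous() copies the data into a layout that matches the logical element order:

```python
import torch

x = torch.arange(6).reshape(2, 3)
y = x.t()                       # transpose returns a non-contiguous view
print(y.is_contiguous())        # False

# .contiguous() copies the data into a new, contiguous layout so that
# subsequent indexing matches the logical element order
z = y.contiguous()
print(z.is_contiguous())        # True
print(torch.equal(y, z))        # True -> same values, different memory layout
```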

Yes, reshape combines a view and a contiguous op. As described, if an operation expects a contiguous tensor, PyTorch will either raise an error or call .contiguous() internally.
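To illustrate, a small sketch (again using a transposed tensor as the non-contiguous example): view fails on non-contiguous memory, while reshape falls back to a copy, roughly equivalent to calling .contiguous() first:

```python
import torch

y = torch.arange(6).reshape(2, 3).t()   # non-contiguous

# view requires contiguous memory and raises a RuntimeError here
try:
    y.view(-1)
except RuntimeError as e:
    print("view failed:", e)

# reshape falls back to a copy, roughly y.contiguous().view(-1)
print(y.reshape(-1))                     # tensor([0, 3, 1, 4, 2, 5])
```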


So, can I conclude that using torch.reshape is better than torch.view?

No, it’s not strictly better: reshape may silently trigger a copy that you often don’t need, whereas view fails loudly and lets you avoid that copy.
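A sketch of that difference, checking data_ptr() to see whether a copy was made (the shapes here are just illustrative):

```python
import torch

x = torch.randn(4, 4)                     # contiguous

# On a contiguous tensor both calls return views of the same storage;
# no copy is made in either case.
print(x.view(16).data_ptr() == x.data_ptr())     # True
print(x.reshape(16).data_ptr() == x.data_ptr())  # True

# On a non-contiguous tensor, reshape silently copies, while view fails
# loudly, which makes an unintended copy easier to notice.
y = x.t()
print(y.reshape(16).data_ptr() == y.data_ptr())  # False -> copy
```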
