Hi, I have read a lot of threads regarding contiguous tensors. I know some say it can affect the speed of the forward pass, but I am curious: does it affect the network's training results, e.g. accuracy, loss, etc.?
Also, it seems that when using torch.reshape there is no need to call .contiguous() first, right?
If an operation expects a contiguous tensor, it will make the memory contiguous explicitly, since otherwise the indexing might be wrong, which would influence the training in a bad way. Contiguity only changes the memory layout, not the values, so it does not change accuracy or loss.
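A small sketch of that point: below, the same values are stored once contiguously and once in a non-contiguous view, and an op produces identical results for both (nn.Linear is just an illustrative choice of operation, not a special case):

```python
import torch

torch.manual_seed(0)

x = torch.randn(8, 4)               # contiguous tensor
# Build a non-contiguous tensor holding the exact same values:
# transpose -> copy -> transpose back permutes the strides only.
nc = x.t().contiguous().t()

print(x.is_contiguous())            # True
print(nc.is_contiguous())           # False
print(torch.equal(x, nc))           # True: identical values

# Running an op on both layouts yields the same output,
# since PyTorch handles the strides (or copies) internally.
lin = torch.nn.Linear(4, 2)
out_c = lin(x)
out_nc = lin(nc)
print(torch.allclose(out_c, out_nc))  # True
```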
Yes, reshape combines a view and a contiguous op: it returns a view when the memory layout allows it, and copies otherwise. As described above, PyTorch will either raise an error if an operation expects a contiguous tensor, or call .contiguous() internally.
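A minimal illustration of the difference between .view() and .reshape() on a non-contiguous tensor:

```python
import torch

x = torch.arange(6).reshape(2, 3)
t = x.t()                        # transposed view, shape (3, 2)
print(t.is_contiguous())         # False

# t.view(6) would raise a RuntimeError here, because .view()
# requires compatible (contiguous) memory.

# .reshape() succeeds: it falls back to copying the data,
# equivalent to the explicit two-step call below.
r = t.reshape(6)
r2 = t.contiguous().view(6)
print(r.is_contiguous())         # True
print(torch.equal(r, r2))        # True
```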