Does squeeze() cause a tensor to become non-contiguous?

N, C, D, H, W = out.size()
if N == 1:
    # squeeze(0) gives (C, D, H, W); transpose(0, 1) reorders to (D, C, H, W)
    out = out.squeeze(0).transpose(0, 1).reshape(D, C, H, W)
out = self.bn(out)

Then I got an error about needing a contiguous tensor. I resolved it by passing out.contiguous() into the self.bn layer, but this behavior seemed strange to me.

Did squeeze() cause this, and am I really expected to ensure contiguity myself?
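For context, here is a minimal standalone check I put together with a dummy tensor and made-up shapes (not my real model), printing is_contiguous() after each op to see which one actually drops contiguity:

import torch

x = torch.randn(1, 3, 4, 5, 6)           # dummy (N, C, D, H, W) with N == 1
print(x.squeeze(0).is_contiguous())       # True: squeeze() of a contiguous tensor stays contiguous
y = x.squeeze(0).transpose(0, 1)          # (D, C, H, W); transpose only permutes strides
print(y.is_contiguous())                  # False: this is where contiguity is lost
z = y.reshape(4, 3, 5, 6)                 # target shape equals y's shape, so reshape() returns a view
print(z.is_contiguous())                  # False: the view keeps the permuted strides
print(y.contiguous().is_contiguous())     # True: contiguous() copies into a fresh layout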

UPDATE:

out = out.transpose(1, 2).reshape(N * D, C, H, W)  # (N, C, D, H, W) -> (N*D, C, H, W)

This also triggers the non-contiguous tensor error, so maybe it's not squeeze() after all! But shouldn't reshape() take care of contiguity itself?
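To narrow it down I ran the same pattern on a toy tensor (again with made-up shapes); interestingly, whether reshape() returns a view or a copy here seems to depend on N:

import torch

C, D, H, W = 3, 4, 5, 6
for N in (1, 2):
    out = torch.randn(N, C, D, H, W)
    r = out.transpose(1, 2).reshape(N * D, C, H, W)
    print(N, r.is_contiguous())
# N == 1 prints False: the reshape can be satisfied as a view, so it stays non-contiguous
# N == 2 prints True: no view fits these strides, so reshape() silently copies

So reshape() only "takes care of it" when it is forced to copy; when a view is possible (as in my N == 1 branch) the result keeps the non-contiguous layout.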

After some searching I also found this issue, and it clears things up properly!