```python
N, C, D, H, W = out.size()
if N == 1:
    out = out.squeeze(0).t().reshape(D, C, H, W)
out = self.bn(out)
```
Then I got an error about the tensor not being contiguous. I resolved it by making the input to the self.bn layer contiguous, but I found this behavior strange.
Did squeeze cause this, and am I really expected to make sure the tensor is contiguous myself?
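To check whether squeeze alone is the culprit, a minimal standalone check (dummy shape, nothing else from my model) would be:

```python
import torch

# Does squeeze on its own break contiguity?
out = torch.randn(1, 8, 4, 16, 16)        # N, C, D, H, W with N == 1
print(out.squeeze(0).is_contiguous())      # squeezing a contiguous tensor keeps it contiguous
```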
I also tried this instead:

```python
out = out.transpose(1, 2).reshape(N * D, C, H, W)
```
This also raises the non-contiguous tensor error, so maybe it isn't squeeze after all! But shouldn't reshape take care of making the result contiguous itself?
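Here is a minimal standalone sketch (dummy shapes, the self.bn layer left out) for checking contiguity after each step:

```python
import torch

out = torch.randn(1, 8, 4, 16, 16)               # N, C, D, H, W with N == 1
N, C, D, H, W = out.size()

transposed = out.transpose(1, 2)                  # view with swapped strides
print(transposed.is_contiguous())                 # False

reshaped = transposed.reshape(N * D, C, H, W)     # reshape copies only when a view is impossible
print(reshaped.is_contiguous())                   # here it can stay a view, so it may still be False
```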