Tensor dimension issue


I have an annoying issue that I can’t figure out.

Here is reproducible code in Colab:

The error appears in the “Running train-loop” section.

First of all, I see this line:

1/5 * Epoch (train): 0% 0/1 , which is quite weird; why 0/1?

Then I have the classic error: The size of tensor a must match the size of tensor b at non-singleton dimension 4 . I can’t work out where it comes from, since in PyTorch everything is more or less hidden.

I’m quite new to PyTorch; do you have any ideas?


I didn’t have a look at the notebook, but this is a common error. You should check the dimensions of your input data against the dimensions your network expects. A good way to debug this is to print the tensor dimensions as training progresses; if the dimensions change along the way, you’ll know where to look.
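For example, a minimal sketch of printing shapes inside `forward` (the model here is a made-up two-layer net, not the one from the notebook):

```python
import torch
import torch.nn as nn

# Hypothetical tiny net, just to illustrate shape printing.
class TinyNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.conv = nn.Conv2d(3, 8, kernel_size=3, padding=1)
        self.fc = nn.Linear(8 * 32 * 32, 10)

    def forward(self, x):
        print("input:", x.shape)          # e.g. torch.Size([4, 3, 32, 32])
        x = self.conv(x)
        print("after conv:", x.shape)     # e.g. torch.Size([4, 8, 32, 32])
        x = x.flatten(1)                  # collapse all dims after the batch dim
        print("after flatten:", x.shape)  # e.g. torch.Size([4, 8192])
        return self.fc(x)

out = TinyNet()(torch.rand(4, 3, 32, 32))
print("output:", out.shape)
```

The first print whose shape doesn’t match what the next layer expects points at the offending operation.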

Yes, I think this is a tensor dimension problem, but as I’m very new I have no idea how to check it, or how to reshape if something does change. Very kind of you, thanks a lot.
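For reference, once the mismatched dimension is found, the usual reshaping tools look like this (the shapes below are made up for illustration):

```python
import torch

x = torch.rand(4, 6)       # suppose the net expects shape (4, 2, 3)
y = x.reshape(4, 2, 3)     # same data, new view of the dimensions
z = x.unsqueeze(0)         # add a singleton batch dim -> shape (1, 4, 6)
print(y.shape, z.shape)
```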

Could you just paste a minimal snippet with tensors generated by torch.rand(…)? So that we can easily run it.

Just post a snippet regarding your problem here. That’d be helpful.

Everything is in Colab; I think it is a reproducible example/error. I can’t see what you are expecting, sorry.

Indeed, but debugging in Colab is not the same as debugging on a local machine with a proper IDE.
It’s also a waste of time to keep reading code that isn’t related to the problem, to wait for Colab to initialize and download packages, and a long list of etceteras.

If there’s a problem in a network, the simplest thing is to provide something like:

class Mynet(nn.Module):
    ...

model = Mynet(...)
output = model(torch.rand(A, B, C))

That way, by copying those few lines of code, we can reproduce the issue without wading through a bigger chunk of code.
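For instance, a self-contained snippet reproducing this thread’s exact error could look like this (the model and shapes are invented for illustration, not taken from the notebook):

```python
import torch
import torch.nn as nn

# Hypothetical minimal model with a deliberately wrong parameter shape,
# to show the kind of snippet that makes the error reproducible.
class MyNet(nn.Module):
    def __init__(self):
        super().__init__()
        # last dim is 7 on purpose; it won't broadcast against inputs ending in 6
        self.scale = nn.Parameter(torch.ones(1, 1, 1, 1, 7))

    def forward(self, x):
        return x * self.scale  # elementwise op; shapes must broadcast

model = MyNet()
try:
    model(torch.rand(2, 3, 4, 5, 6))
except RuntimeError as e:
    # prints something like: "The size of tensor a (6) must match the size
    # of tensor b (7) at non-singleton dimension 4"
    print(e)
```

With a snippet like this, anyone can paste five lines, see the same traceback, and spot the mismatched dimension immediately.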