Gaussian Loss and variance resize

Hello, I'm using the Gaussian loss this way:

        varTrain = torch.ones(224, 1, requires_grad=True)
        varTrain = torch.reshape(varTrain, (32, 7))
        loss = loss_mse(recon_batch, XTrain, mu, logvar) + loss_gauss(predictions, YTrain.view(-1, 1), varTrain)

where predictions is [32, 7], YTrain holds the true labels (0, 1, 2, …), and varTrain should have the same size as predictions. The code works only during the first epoch; after that I get the following error:

        ValueError: var is of incorrect size

How can I fix it? Thank you in advance.

From the docs:

Var: (N, *) or (*), same shape as the input, or same shape as the input but with one dimension equal to 1, or same shape as the input but with one fewer dimension (to allow for broadcasting)

Check the shape of varTrain and make sure it meets these requirements.
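
One way to narrow it down is to check the shapes on every iteration, right before the loss call. A quick sketch that plugs into your loop, assuming loss_gauss is nn.GaussianNLLLoss and you want varTrain to have exactly the same shape as predictions:

        # print/assert the shapes on every batch, just before computing the loss
        print(predictions.shape, YTrain.view(-1, 1).shape, varTrain.shape)
        assert varTrain.shape == predictions.shape, \
            f"var {tuple(varTrain.shape)} != predictions {tuple(predictions.shape)}"
        loss = loss_gauss(predictions, YTrain.view(-1, 1), varTrain)

The assert will tell you exactly which batch fails and which tensor changed shape.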

Actually, varTrain has shape [32, 7], the same as predictions, and indeed the code does work during the first epoch. After that I get the error and I cannot understand why. Any suggestions?

The shape seems to change after the first epoch, so check why that’s the case.
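
One guess (I can't see your training loop, so take this as an assumption): if the DataLoader does not drop the incomplete last batch, predictions shrinks to [last_batch_size, 7] for that batch while varTrain stays [32, 7], which would raise exactly this error at the end of the first epoch. A minimal, self-contained sketch of one workaround, slicing the variance to the current batch size (assuming loss_gauss is nn.GaussianNLLLoss):

        import torch
        import torch.nn as nn

        loss_gauss = nn.GaussianNLLLoss()
        varTrain = torch.ones(32, 7, requires_grad=True)   # variance for a full batch, as in your snippet

        # two fake batches: a full one and a smaller "last" one
        batches = [(torch.randn(32, 7), torch.zeros(32)), (torch.randn(17, 7), torch.zeros(17))]
        for predictions, YTrain in batches:
            var = varTrain[:predictions.size(0)]           # slice so var always matches predictions
            loss = loss_gauss(predictions, YTrain.view(-1, 1), var)
            print(predictions.shape, var.shape, loss.item())

Alternatively, passing drop_last=True to the DataLoader keeps every batch at 32 and avoids the mismatch altogether.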

Yes, I noticed that, but I really don't know why. I think I did the same thing as the example in the docs.
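
For reference, the docs example I'm comparing against goes roughly like this (paraphrased from memory, heteroscedastic case):

        import torch
        import torch.nn as nn

        loss_fn = nn.GaussianNLLLoss()
        input = torch.randn(5, 2, requires_grad=True)
        target = torch.randn(5, 2)
        var = torch.ones(5, 2, requires_grad=True)   # one variance per prediction
        output = loss_fn(input, target, var)
        output.backward()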