I have a multi-class problem and I want to use MSE loss. I have weights, so the loss is:

```python
out = (input - target) ** 2
out = out * weights.expand_as(out)
loss = out.sum(0)
```

But now the loss is a tensor of length 61 (I have 61 classes), and I get the error:

“grad can be implicitly created only for scalar outputs”
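For context, a minimal sketch that reproduces this error (shapes and tensor names are assumptions based on the description: 61 classes, and a batch dimension first):

```python
import torch

# Hypothetical shapes: batch of 4 samples, 61 classes
input = torch.randn(4, 61, requires_grad=True)   # model output
target = torch.randn(4, 61)                      # e.g. one-hot encoded labels
weights = torch.rand(61)                         # per-class weights

out = (input - target) ** 2
out = out * weights.expand_as(out)
loss = out.sum(0)        # shape [61] -- still a vector, not a scalar

try:
    loss.backward()      # fails: backward() needs a scalar (or an explicit gradient)
except RuntimeError as e:
    print(e)             # "grad can be implicitly created only for scalar outputs"
</imports>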
As an aside, you probably don’t want to use MSELoss for a multi-class classification problem; CrossEntropyLoss is the usual choice.

out.sum(0) is summing over the batch dimension of your input and target (even if your batch size is 1). Given what you say, I speculate that input (the output of your model) is a vector of 61 values, one for each class (for a single sample in your batch), and that target is also a vector of 61 values (perhaps your class labels, one-hot encoded).

If you want to do this (and you probably don’t), you should be calling out.sum() to sum over all elements of the loss tensor, that is, over both the batch and class dimensions.
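A sketch of that fix, under the same assumed shapes (batch of 4, 61 classes; tensor names hypothetical):

```python
import torch

input = torch.randn(4, 61, requires_grad=True)   # model output
target = torch.randn(4, 61)                      # one-hot style targets
weights = torch.rand(61)                         # per-class weights

out = (input - target) ** 2
out = out * weights.expand_as(out)

loss = out.sum()         # scalar: sums over BOTH batch and class dimensions
loss.backward()          # works now
print(loss.shape)        # torch.Size([])
print(input.grad.shape)  # torch.Size([4, 61])
```

Because the loss is now a zero-dimensional tensor, autograd can implicitly create the initial gradient of 1.0 for backward().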
You mean I should return loss.sum()?
What I really mean is that you should reorganize your problem a little bit and use CrossEntropyLoss. However, if you have a good reason to be using MSELoss (and I’m right that your input and target have shape [nBatch, 61]), then, yes, you should return out.sum() (rather than out.sum(0)), so that you will be summing over classes as well as over the samples in your batch.