Wrong initial loss

Hi, I want to check the initial loss of my network using the following code:

import torch
import torch.nn as nn
from torch.autograd import Variable

criterion = nn.MSELoss()
input = Variable(torch.rand(1, 1, 27, 27, 27))
label = Variable(torch.rand(1, 1, 27, 27, 27))
net = YLNet3D(1, 9)
output = net(input)
loss = criterion(output, label)

print(loss)

But the loss is not what it’s supposed to be (my num_classes is 9, so the initial loss should be -ln(1/9) ≈ 2.197). What might be the reason for that? Thanks!
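(For reference, the 2.197 figure is just the cross-entropy of a uniform prediction over num_classes classes, i.e. -ln(1/9). A quick check:)

```python
import math

num_classes = 9
# cross-entropy of a uniform prediction: -ln(1/K) = ln(K)
expected = -math.log(1.0 / num_classes)
print(round(expected, 3))  # 2.197
```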

class MySoftmax(nn.Module):

    def forward(self, input_):
        # apply softmax to each sample in the batch separately
        batch_size = input_.size()[0]
        output_ = torch.stack([Funct.softmax(input_[i]) for i in range(batch_size)], 0)
        return output_

Could you provide a working script (or just the values of the target and output)? I’m not sure what YLNet3D is.

By default MSELoss averages over all elements (see the docs). Maybe you were not expecting the averaging behavior?
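To illustrate the averaging behavior, here is a tiny sketch (in current PyTorch the switch is the `reduction` argument; older versions used `size_average`):

```python
import torch
import torch.nn as nn

output = torch.tensor([1.0, 2.0, 3.0])
target = torch.tensor([1.0, 1.0, 1.0])

# default reduction averages the squared errors: (0 + 1 + 4) / 3
mean_loss = nn.MSELoss()(output, target)
# reduction='sum' keeps the total instead
sum_loss = nn.MSELoss(reduction='sum')(output, target)

print(round(mean_loss.item(), 4), sum_loss.item())  # 1.6667 5.0
```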

Hi, thank you for the reply!

Actually, I want to classify 9 classes, and my inputs are 5D (batch × channels × L × W × H). Do you have any suggestions for which loss function to use? Thanks!!
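(For per-voxel classification, `nn.CrossEntropyLoss` in current PyTorch accepts K-dimensional input: logits of shape (N, C, d1, d2, d3) and integer class labels of shape (N, d1, d2, d3). A sketch using the shapes from this thread; with all-zero logits the prediction is uniform, so the loss is exactly ln(9):)

```python
import torch
import torch.nn as nn

num_classes = 9
# raw, un-softmaxed scores: batch x classes x L x W x H
logits = torch.zeros(1, num_classes, 27, 27, 27)
# one integer class label per voxel: batch x L x W x H
target = torch.randint(0, num_classes, (1, 27, 27, 27))

criterion = nn.CrossEntropyLoss()  # applies log-softmax internally
loss = criterion(logits, target)
print(round(loss.item(), 3))  # 2.197, i.e. ln(9)
```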

Oh, so do you mean that if I’m using a loss function other than cross entropy, I should not expect the initial loss to be -ln(1/num_classes)? Sorry, I’m totally a newbie.
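(For reference, a quick sketch of why the numbers differ: -ln(1/K) is specifically the cross-entropy of a uniform prediction. The MSE between that same uniform prediction and a one-hot target, for a single hypothetical sample, is much smaller:)

```python
import torch
import torch.nn as nn

num_classes = 9
# uniform predicted probabilities for one sample
probs = torch.full((1, num_classes), 1.0 / num_classes)
# one-hot target (class 0 chosen arbitrarily)
target = torch.zeros(1, num_classes)
target[0, 0] = 1.0

mse = nn.MSELoss()(probs, target)
print(round(mse.item(), 4))  # 0.0988 (= 8/81), nowhere near 2.197
```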