NaN loss for weighted binary cross entropy

Hi there,

I have a classification problem with the following description.

A deep neural network with the following output and target shapes:
Output has size: batch_size*19*19*5
Target has size: batch_size*19*19*5

The output tensor has values in (-inf, +inf) and the target tensor has binary values (zero or one). My task is a binary classification problem; in fact, each element of the output tensor is a separate classifier output. I would like to use torch.nn.functional.binary_cross_entropy for optimization.
I have written the code below for the loss function:
F.binary_cross_entropy_with_logits(output, target).

According to my analysis, I found that the numbers of samples in the two classes are not balanced, so I decided to use a weighted loss function instead of the plain one. In other words, I would like to use weighted binary cross entropy. I know that binary cross entropy has a weight input argument, but I think it is not practical for the purpose described above. Could you please tell me how I can do this in PyTorch?
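For instance, one way I could imagine building such a per-element weight tensor from the imbalance (this particular weighting scheme is just a sketch, not something I have validated) would be:

import torch

# target has size batch_size*19*19*5 with binary (float) values
num_pos = target.sum()                              # number of positive elements
num_neg = target.numel() - num_pos                  # number of negative elements
pos_weight = num_neg / num_pos.clamp(min=1)         # up-weight the rarer positive class
weight = torch.where(target == 1, pos_weight, torch.ones_like(target))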

[edited]
I did:

def wbce(output, target, weight):
    # output size = batch_size*19*19*5 (raw logits)
    # target size = batch_size*19*19*5 (binary labels)
    # weight size = batch_size*19*19*5 (per-element weights)
    prob = torch.sigmoid(output)
    return torch.mean(-weight * (target * torch.log(prob) + (1 - target) * torch.log(1 - prob)))

But I think there is something wrong with it; the loss becomes NaN. Could you please help me find any possible errors?

Thanks :slight_smile:

The cross-entropy has a (resolvable) singularity at 0 and 1 (an infinity Ɨ 0 situation or so). The built-in function regularizes this. You could clamp the score to achieve something similar.

Best regards

Thomas
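
(A quick way to see the infinity Ɨ 0 failure mode concretely, with made-up values:)

import torch

x = torch.tensor(-200.0)                   # a very large negative logit
p = torch.sigmoid(x)                       # underflows to exactly 0.0 in float32
print(torch.log(p))                        # tensor(-inf)
print(torch.tensor(0.0) * torch.log(p))    # tensor(nan): 0 * -inf, as in target * log(prob)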

Hi Thomas,
Thanks for your response! Could you please tell me how? I mean, in my code above.

Iā€™d try

output = output.clamp(min=-10, max=10)

at the top of the function. There is no science behind the 10; use whatever works and looks large enough for you.

Best regards

Thomas
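
(Putting the clamp together with the wbce function above, a minimal sketch; the 10 is just the value suggested in the reply:)

import torch

def wbce(output, target, weight):
    # clamp the logits so sigmoid never returns exactly 0 or 1,
    # which would otherwise make log() produce -inf and the loss NaN
    output = output.clamp(min=-10, max=10)
    prob = torch.sigmoid(output)
    return torch.mean(-weight * (target * torch.log(prob) + (1 - target) * torch.log(1 - prob)))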


Let me check it! Thanks for your response.