Manual Calculation of Binary Cross Entropy with Logits

To make sure I understand how BCE-with-logits loss works in PyTorch, I tried to calculate the loss manually. However, I cannot reconcile my manual calculation with the loss produced by the PyTorch function F.binary_cross_entropy_with_logits.

Can somebody please explain what I am doing wrong? My code is below:

    import torch
    import torch.nn.functional as F
    import math

    def sigmoid(x):
        return 1 / (1 + math.exp(-x))

    def manual_cross_entropy_with_logits(inp, target):
        y = target[0]
        x = inp[0]
        p1 = y * math.log(sigmoid(x), 2)
        p0 = (1 - y) * math.log(1 - sigmoid(x), 2)
        return -1 * (p1 + p0)

    inp = torch.randn(1, requires_grad=True)
    target = torch.empty(1).random_(2)

    loss = F.binary_cross_entropy_with_logits(inp, target)
    lossman = manual_cross_entropy_with_logits(inp, target)
    print('bce loss')
    print(loss)
    print('manual loss')
    print(lossman)

Hi Arif!

You need to use the natural logarithm (log base e) in your
cross-entropy formula; you are using log base 2. Remove the
second argument from your calls to Python's math.log(), which
then defaults to the natural log. Thus:

    p1=y*(math.log(sigmoid(x)))
    p0=(1-y)*math.log(1-sigmoid(x))

As a side note, calculating log() and sigmoid() separately
can amplify numerical errors. You can avoid this by using
something like PyTorch's logsigmoid(), which uses a more
stable algorithm internally.
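
Here is a minimal sketch of the difference for an extreme
logit (the value -200.0 is just an arbitrary example):

    import torch
    import torch.nn.functional as F

    x = torch.tensor([-200.0])
    # naive: sigmoid() underflows to 0.0, so log() returns -inf
    print(torch.log(torch.sigmoid(x)))   # tensor([-inf])
    # logsigmoid() computes the same quantity with a stable algorithm
    print(F.logsigmoid(x))               # tensor([-200.])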

Good luck!

K. Frank


Sorry to bump this thread, but how do I implement log(1 - sigmoid(x)) with logsigmoid?

Hi Eliphat!

sigmoid (-x) = 1 - sigmoid (x), so you can compute logsigmoid (-x).

This has the same numerical benefits when x becomes large and positive
as logsigmoid (x) has when x becomes large and negative.
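
Putting the two pieces together, a fully stable version of the
manual calculation might look like this (a sketch; the name
stable_bce_with_logits is just illustrative, not a library
function):

    import torch
    import torch.nn.functional as F

    def stable_bce_with_logits(inp, target):
        # log(sigmoid(x)) and log(1 - sigmoid(x)) == log(sigmoid(-x)),
        # both computed via the numerically stable logsigmoid()
        return -(target * F.logsigmoid(inp)
                 + (1 - target) * F.logsigmoid(-inp)).mean()

    inp = torch.tensor([150.0])
    target = torch.tensor([0.0])
    # both print tensor(150.); the naive formula would give inf here
    print(F.binary_cross_entropy_with_logits(inp, target))
    print(stable_bce_with_logits(inp, target))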

Best.

K. Frank
