Hello, I’ve tried to write a custom loss function, a weighted binary cross-entropy, as suggested by @miguelvr here:

I’m using the following function (wrapped inside a class):

```
def weighted_binary_cross_entropy(output, target, weights=None):
    # output is expected to be a probability in (0, 1), e.g. after torch.sigmoid
    if weights is not None:
        assert len(weights) == 2
        # weights[1] scales the positive class, weights[0] the negative class
        loss = weights[1] * (target * torch.log(output)) + \
               weights[0] * ((1 - target) * torch.log(1 - output))
    else:
        loss = target * torch.log(output) + (1 - target) * torch.log(1 - output)
    return torch.neg(torch.mean(loss))
```

The problem is that this sometimes outputs nan or -inf.

Tracing it back, I concluded that my model sometimes outputs very large negative logits, for example -136. This leads to:

`torch.sigmoid(torch.tensor([-136.]))` returning `tensor([0.])`, which then produces -inf in `torch.log()`.

I am on PyTorch 1.0.1.

Is it OK to work around this by clamping, e.g. `torch.clamp(torch.sigmoid(torch.tensor([-136.])), min=1e-8, max=1 - 1e-8)`?
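For context, here is a sketch of an alternative I’m considering instead of clamping: computing the loss directly from the raw logits with `F.logsigmoid`, since `log(sigmoid(x)) = logsigmoid(x)` and `log(1 - sigmoid(x)) = logsigmoid(-x)`, which I believe avoids the underflow entirely. The function name and weights convention below just mirror my code above; this is only an illustration, not a tested replacement:

```python
import torch
import torch.nn.functional as F

def weighted_bce_with_logits(logits, target, weights=None):
    # Work in log-space: logsigmoid never underflows to log(0),
    # whereas log(sigmoid(x)) does for very negative x.
    log_p = F.logsigmoid(logits)      # log(sigmoid(logits))
    log_1mp = F.logsigmoid(-logits)   # log(1 - sigmoid(logits))
    if weights is not None:
        assert len(weights) == 2
        loss = weights[1] * target * log_p + weights[0] * (1 - target) * log_1mp
    else:
        loss = target * log_p + (1 - target) * log_1mp
    return torch.neg(torch.mean(loss))

# Extreme logits that make the sigmoid-then-log version blow up:
logits = torch.tensor([-136.0, 0.0, 5.0])
target = torch.tensor([0.0, 1.0, 1.0])
loss = weighted_bce_with_logits(logits, target, weights=[1.0, 2.0])
print(loss, torch.isfinite(loss).item())
```

Without weights, this should agree with `F.binary_cross_entropy_with_logits(logits, target)`.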