Hello!
I’m writing a custom loss function, which looks like this:
import torch
import torch.nn.functional as F

class CustomLoss(object):
    def __init__(self, alpha, beta):
        self.alpha = alpha
        self.beta = beta

    def calc(self, first, second, k):
        # Similarity from the pairwise distance between the flattened inputs
        u = torch.exp(-self.beta * F.pairwise_distance(
            torch.reshape(first, (first.size(0), -1)),
            torch.reshape(second, (first.size(0), -1))))
        # First divergence term
        D1 = torch.sum(u * torch.log((self.alpha * u) / ((self.alpha - 1) * u + k)))
        # Second divergence term; NaN entries are zeroed out before summing
        D2 = k * torch.log((self.alpha * k) / ((self.alpha - 1) * k + u))
        D2[torch.isnan(D2)] = 0
        D2 = torch.sum(D2)
        return D1 + D2
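For context, here is roughly how I call it during training. The model, tensor shapes, hyperparameters, and k below are only placeholders, not my actual setup:

# Minimal call sketch; everything here is a placeholder assumption.
import torch
import torch.nn as nn

model = nn.Linear(16, 16)                      # stand-in for my real network
loss_fn = CustomLoss(alpha=2.0, beta=0.5)      # placeholder hyperparameters
optimizer = torch.optim.SGD(model.parameters(), lr=1e-2)

first = model(torch.randn(8, 16))              # batch of 8 embeddings from the model
second = torch.randn(8, 16)                    # batch of target embeddings
k = torch.full((8,), 0.5)                      # placeholder k, same shape as u

loss = loss_fn.calc(first, second, k)
optimizer.zero_grad()
loss.backward()
optimizer.step()
print(loss.item())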
However, after the first optimizer.step() the loss becomes NaN. Can you tell me what the problem might be?
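For what it's worth, this is the kind of check I plan to run to find where the NaN first appears, using PyTorch's built-in anomaly detection (reusing the names from the snippet above):

# Sketch: locate the operation whose backward first produces NaN.
import torch

torch.autograd.set_detect_anomaly(True)   # raises a RuntimeError at the forward op whose backward yields NaN

loss = loss_fn.calc(first, second, k)
print(torch.isnan(loss).item())           # is the forward pass already NaN?
loss.backward()                           # with anomaly detection enabled, pinpoints the failing op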
Thanks in advance!