Hi,
I've implemented the following loss function, but noticed during my last training run that it is the reason my loss goes to NaN. When I changed the loss function to a hard triplet margin loss, the network trained with no issue. Could you please help me pinpoint where this loss might be wrong?
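For context, the working replacement was essentially the built-in hard-margin triplet loss, along these lines (the margin value here is just what I happened to use):

# The built-in hard-margin triplet loss that trains without NaN
# (margin=1.0 is only an example value):
triplet_loss = torch.nn.TripletMarginLoss(margin=1.0)
loss = triplet_loss(anchor, positive_match, negative_match)

My custom soft-margin version is below.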
import torch
import torch.nn.functional as F


class SoftMarginRankingLoss(torch.nn.Module):
    def __init__(self, alpha=1):
        super(SoftMarginRankingLoss, self).__init__()
        self.alpha = alpha

    def forward(self, anchor, positive_match, negative_match):
        # Squared Euclidean distance between anchor and positive
        distance_pos = torch.square(F.pairwise_distance(anchor, positive_match, keepdim=True))
        # Squared Euclidean distance between anchor and negative
        distance_neg = torch.square(F.pairwise_distance(anchor, negative_match, keepdim=True))
        distance = distance_pos - distance_neg
        # Soft-margin formulation: log(1 + exp(alpha * (d_pos - d_neg)))
        loss = torch.log(1 + torch.exp(self.alpha * distance))
        # mean() is required because backward() expects a scalar loss
        return loss.mean()
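In case it helps, this is roughly how I call it in the training loop; the batch size and embedding dimension below are placeholders, and in the real run the embeddings come out of my network rather than torch.randn:

# Minimal sketch of how the loss is invoked; shapes are illustrative only.
criterion = SoftMarginRankingLoss(alpha=1)

# Stand-ins for network outputs (hence requires_grad=True):
anchor = torch.randn(32, 128, requires_grad=True)
positive_match = torch.randn(32, 128)  # embeddings of matching samples
negative_match = torch.randn(32, 128)  # embeddings of non-matching samples

loss = criterion(anchor, positive_match, negative_match)
loss.backward()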