How to add new loss functions

I have defined a custom loss function in PyTorch, but I am getting an error that I cannot find a solution for. Here is my code:

class cust_loss(torch.nn.Module):
    def __init__(self):
        super(cust_loss, self).__init__()

    def forward(self, input, target):
        predicted_labels = torch.max(input, 1)[1]
        minus = torch.max(input, 1)[1] - target
        cust_distance = torch.sum(minus*minus).type(torch.FloatTensor)/predicted_labels.size()[0]
        return cust_distance

######## within main function ######

criterion = cust_loss()  # nn.CrossEntropyLoss()
optimizer = optim.SGD(filter(lambda p: p.requires_grad, model_conv.parameters()), lr=1e-3, momentum=0.9)

loss = criterion(inputs, labels)
loss.backward()
Unfortunately, I got this error:
Traceback (most recent call last):
  File "/home/morteza/PycharmProjects/transfer_learning/test_SkinDetection.py", line 250, in <module>
    main(True)
  File "/home/morteza/PycharmProjects/transfer_learning/test_SkinDetection.py", line 130, in main
    loss.backward()
  File "/home/morteza/anaconda3/lib/python3.6/site-packages/torch/autograd/variable.py", line 156, in backward
    torch.autograd.backward(self, gradient, retain_graph, create_graph, retain_variables)
  File "/home/morteza/anaconda3/lib/python3.6/site-packages/torch/autograd/__init__.py", line 98, in backward
    variables, grad_variables, retain_graph)
  File "/home/morteza/anaconda3/lib/python3.6/site-packages/torch/autograd/function.py", line 91, in apply
    return self._forward_cls.backward(self, *args)
  File "/home/morteza/anaconda3/lib/python3.6/site-packages/torch/autograd/_functions/basic_ops.py", line 38, in backward
    return maybe_unexpand(grad_output, ctx.a_size), maybe_unexpand_or_view(grad_output.neg(), ctx.b_size), None
  File "/home/morteza/anaconda3/lib/python3.6/site-packages/torch/autograd/variable.py", line 381, in neg
    return Negate.apply(self)
  File "/home/morteza/anaconda3/lib/python3.6/site-packages/torch/autograd/_functions/basic_ops.py", line 224, in forward
    return i.neg()
AttributeError: 'torch.LongTensor' object has no attribute 'neg'

I could not solve it. I traced the code and compared it with code that runs without errors, but I could not find the cause. Moreover, I defined my inputs and labels as Variables with the requires_grad=True parameter.
Please guide me on how to solve it.
Thank you.

When autograd differentiates the subtraction in your forward, it calls .neg() to handle the "-" (you can see this happening in basic_ops.py in your traceback). But predicted_labels is a LongTensor, since torch.max(input, 1)[1] returns argmax indices, and neg is not implemented for LongTensor.

Try:

predicted_labels = torch.max(input, 1)[1].float()
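
You can verify the dtype yourself; a quick check (a minimal sketch, assuming PyTorch 0.4 or newer, where tensors expose a dtype attribute):

    import torch

    logits = torch.randn(4, 3)      # stand-in for network outputs: 4 samples, 3 classes
    idx = torch.max(logits, 1)[1]   # argmax indices along the class dimension
    print(idx.dtype)                # torch.int64 -- i.e. a LongTensor
    print(idx.float().dtype)        # torch.float32 after the suggested cast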

Thank you very much for your reply and hint. I was able to find the problem.
I don't know the exact reason, but it seems the problem is due to the LongTensor type, which must be changed to FloatTensor. In this regard, every tensor in the computation must be a FloatTensor. So I changed the forward function in my cust_loss class as follows, and it worked.

    def forward(self, input, target):
        # cast the argmax indices to float so no LongTensor enters the graph
        predicted_labels = torch.max(input, 1)[1].float()
        # cast the targets as well before the subtraction
        minus = predicted_labels - target.float()
        # mean squared distance between predicted and true class indices
        self.cust_distance = torch.sum(minus*minus).type(torch.FloatTensor)/predicted_labels.size()[0]
        return self.cust_distance
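
For anyone hitting this on a recent PyTorch release (0.4+, where plain tensors replaced Variables), a self-contained sketch of the same loss might look like the following; the class name and shapes are just illustrative. Keep in mind that argmax itself is not differentiable, so the .float() cast fixes the dtype error but does not create a gradient path through the indices.

    import torch
    import torch.nn as nn

    class CustLoss(nn.Module):
        """Mean squared distance between predicted class indices and targets."""
        def forward(self, input, target):
            # argmax returns Long (integer) indices; cast to float before
            # any arithmetic so backward never touches an integer tensor
            predicted_labels = torch.max(input, 1)[1].float()
            minus = predicted_labels - target.float()
            return torch.sum(minus * minus) / predicted_labels.size(0)

    criterion = CustLoss()
    logits = torch.randn(8, 5)           # stand-in model outputs: 8 samples, 5 classes
    labels = torch.randint(0, 5, (8,))   # integer class labels
    print(criterion(logits, labels))     # scalar float tensor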