Build your own loss function in PyTorch

Thanks a lot!

I rewrote everything using Torch, so now it should work if I just call loss.backward(X, y)?

Advice on writing my own autograd function and/or computing the similarity more efficiently is very welcome. It is definitely something I will need to do later, but for now I just need a simple version of this working.
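For reference, here is a minimal sketch of what a custom autograd function looks like in current PyTorch (a `torch.autograd.Function` subclass with static `forward`/`backward`). The operation itself, scaling by two, is just a stand-in example, not the similarity computation from this thread:

```python
import torch

class ScaleByTwo(torch.autograd.Function):
    """Toy custom autograd op: forward computes 2*x, backward its gradient."""

    @staticmethod
    def forward(ctx, x):
        # Nothing needs to be saved for this simple backward, but
        # save_for_backward is where intermediate tensors would go.
        ctx.save_for_backward(x)
        return x * 2

    @staticmethod
    def backward(ctx, grad_output):
        # d(2x)/dx = 2, so scale the upstream gradient by 2.
        return grad_output * 2

x = torch.ones(3, requires_grad=True)
y = ScaleByTwo.apply(x).sum()
y.backward()
print(x.grad)  # each element's gradient is 2
```

Custom Functions are applied via `.apply(...)` rather than called directly, so autograd can record them in the graph.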

Edit - It looks like it works after rewriting the final function as:

def customized_loss(X, y):
    # Pairwise similarities of the inputs and the target association matrix.
    X_similarity = Variable(similarity_matrix(X), requires_grad=True)
    association = Variable(convert_y(y), requires_grad=True)
    # Similarity mass between associated pairs vs. everything else.
    loss_num = torch.sum(torch.mul(X_similarity, association))
    loss_all = torch.sum(X_similarity)
    loss_denum = loss_all - loss_num
    loss = loss_num / loss_denum
    return loss
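For anyone trying to reproduce this, below is a self-contained sketch. The `similarity_matrix` and `convert_y` helpers are placeholder assumptions (pairwise distances and a same-label indicator matrix), not the originals from this thread. Note that it skips the fresh `Variable(..., requires_grad=True)` wrappers: wrapping an intermediate result in a new leaf Variable detaches it from the graph, so gradients would stop there instead of flowing back to `X`.

```python
import torch

def similarity_matrix(X):
    # Placeholder assumption: pairwise Euclidean distances between rows of X.
    return torch.cdist(X, X)

def convert_y(y):
    # Placeholder assumption: association[i, j] = 1 if samples i and j
    # share a label, else 0.
    y = y.view(-1, 1)
    return (y == y.t()).float()

def customized_loss(X, y):
    X_similarity = similarity_matrix(X)
    association = convert_y(y)
    loss_num = torch.sum(X_similarity * association)   # same-class mass
    loss_denum = torch.sum(X_similarity) - loss_num    # cross-class mass
    return loss_num / loss_denum

X = torch.randn(8, 4, requires_grad=True)
y = torch.tensor([0, 0, 1, 1, 0, 1, 0, 1])
loss = customized_loss(X, y)
loss.backward()  # gradients flow all the way back into X
```

Because the helpers stay inside the graph, `loss.backward()` (with no arguments, since `loss` is a scalar) populates `X.grad` directly.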

All is good for now, thanks again!