Row-normalized custom loss function


I am trying to implement a regression loss function with multiple outputs, where the loss is something like sum(Y_prediction - Y_target) / max(Y_target),

so far I have:

    def my_loss(output, target):
        maxes = torch.max(target, dim=1)
        diff = output - target
        diff_sums = torch.sum(diff, dim=1)
        loss = torch.div(diff_sums, maxes)
        return loss

but I am getting this error:

div(): argument ‘other’ (position 2) must be Tensor, not torch.return_types.max

Take a look at what torch.max() returns when the dim argument is passed:

    torch.max(input, dim, keepdim=False, *, out=None) -> (Tensor, LongTensor)

It returns a named tuple of (values, indices), not a plain tensor, which is why torch.div() rejects it.

So I think your function is right except for the first line; it should be:

maxes, _ = torch.max(target, dim=1)
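Putting the fix together, a minimal sketch of the corrected function might look like this. The .mean() reduction at the end is my own assumption: the original returns one loss per row, and reducing to a scalar is the usual way to make it usable directly with backward().

    import torch

    def my_loss(output, target):
        # torch.max with dim returns (values, indices); keep only the values
        maxes, _ = torch.max(target, dim=1)
        # row-wise sum of the raw differences
        diff_sums = torch.sum(output - target, dim=1)
        # divide each row's sum by that row's target max, then
        # reduce to a scalar (assumption: mean reduction) for backward()
        return torch.mean(diff_sums / maxes)

    output = torch.tensor([[1.0, 2.0], [3.0, 4.0]])
    target = torch.tensor([[1.0, 1.0], [2.0, 2.0]])
    print(my_loss(output, target).item())  # row losses 1.0 and 1.5 -> mean 1.25

Note that because the row sums are not absolute values, positive and negative errors cancel within a row, which may be part of why this loss behaves poorly in practice.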

Thank you! I made this change and it worked, although it did not turn out to be a very effective loss function, unfortunately. Thanks anyway!