Custom loss function for multi-label

Hi, I have a multi-label problem, and I built a loss:

import numpy as np

def loss(p, y):
    # penalty for predicting classes that are not in the label
    fa = -np.dot(1 - y, np.log(1 - p))
    # per-class log-likelihood of the true labels
    l = y * np.log(p)
    prod = (1 - y) - l
    loss = fa + np.min(prod)
    return loss

The problem is that this function gets a prediction vector of length 61 (one prediction per class) and a label vector of length 61.

What I don't understand is:

  1. How can I turn it into a custom loss that actually works with the code (backprop, etc.)?
  2. How can I handle a batch of all the predictions and labels with this function?

Could you explain why you need a list in the code?
Would it be possible to use a tensor instead?

Regarding point 1: if you can write all your operations using PyTorch methods, Autograd will be able to calculate the gradients in the backward call automatically.
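For example, a sketch of the loss rewritten with pure tensor operations (assuming the intent is a binary-cross-entropy-style loss over the 61 classes; the name `my_loss` is made up for illustration):

```python
import torch

def my_loss(p, y):
    # p: predicted probabilities, y: multi-hot labels, both shape (61,)
    # only PyTorch tensor ops are used, so Autograd can track the graph
    fa = -torch.dot(1 - y, torch.log(1 - p))   # false-positive term
    fn = -torch.dot(y, torch.log(p))           # false-negative term
    return fa + fn

p = torch.full((61,), 0.5, requires_grad=True)  # toy predictions
y = (torch.rand(61) > 0.5).float()              # toy multi-hot labels
out = my_loss(p, y)
out.backward()  # gradients flow because only tensor ops were used
```

After `backward()`, `p.grad` holds the gradient of the loss with respect to the predictions; no manual backward implementation is needed.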

I meant tensor, not list, I'm sorry.
The question actually was:
when I write my own loss function (the one above), and the loss gets a tensor of predictions and a tensor of labels each time, should I sum all the distances?

final_loss = loss(PRED1,LABEL1)+loss(PRED2,LABEL2)+…+loss(PREDn,LABELn)
final_loss.backward()
@ptrblck

You could sum them (and take the average afterwards); it depends on your use case.
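One way to sketch the summed-and-averaged variant (hypothetical names; assumes the predictions and labels are stacked into `(batch_size, 61)` tensors and a binary-cross-entropy-style loss) is a single vectorized call instead of a Python loop:

```python
import torch

def batch_loss(preds, labels):
    # preds, labels: shape (batch_size, 61)
    # per-sample loss: BCE summed over the 61 classes
    per_sample = -(labels * torch.log(preds)
                   + (1 - labels) * torch.log(1 - preds)).sum(dim=1)
    return per_sample.mean()   # average over the batch

preds = torch.full((8, 61), 0.5, requires_grad=True)  # toy batch
labels = (torch.rand(8, 61) > 0.5).float()
final_loss = batch_loss(preds, labels)
final_loss.backward()   # one backward call for the whole batch
```

This is equivalent to the `loss(PRED1,LABEL1)+…+loss(PREDn,LABELn)` sum divided by `n`, but it builds one graph and calls `backward()` once.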

If you are not sure whether each loss creates valid gradients, you could check its .grad_fn.
If it's some function other than None, it should work.
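A minimal sketch of that check (toy tensors; the clamp is just to keep the log finite):

```python
import torch

p = torch.rand(61, requires_grad=True)
y = (torch.rand(61) > 0.5).float()

# loss built from tensor ops: it carries a grad_fn
loss = -(y * torch.log(p.clamp(min=1e-6))).sum()
print(loss.grad_fn)   # e.g. a <...Backward0> object; None would mean no graph

# a detached tensor has no gradient history
detached = loss.detach()
print(detached.grad_fn)   # None
```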

Do you see any errors using this approach?