Custom loss function

Hey! I have already defined my own loss function, and it does work. But I am not sure whether it is correct, because I don't define backward(). I am not sure whether I need to define backward(), and I don't know how to define it.

```python
import numpy as np
import torch.nn as nn
import torch.nn.functional as F

class _Loss(nn.Module):
    def __init__(self, size_average=True):
        super(_Loss, self).__init__()
        self.size_average = size_average

class MyLoss(_Loss):
    def forward(self, input, target):
        loss = 0
        weight = np.zeros((BATCH_SIZE, BATCH_SIZE))
        for a in range(BATCH_SIZE):
            for b in range(BATCH_SIZE):
                weight[a][b] = Censor[a][0]
        for i in range(BATCH_SIZE):
            for j in range(BATCH_SIZE):
                a_ij = (input[i] - input[j] - target[i] + target[j]) * weight[i, j]
                loss += F.relu(a_ij)
        return loss
```

Sorry, but I can't read your code. Try formatting your code using

```
your code here
```

Every op in the forward function needs to operate on Variables, so the weight matrix needs to be a torch tensor (a Variable) rather than a NumPy array.

Also, maybe you should not use for loops; try to vectorize your code.
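To illustrate the vectorization suggestion, here is a sketch of how the double loop in the posted forward() could be collapsed with broadcasting. It assumes input and target are 1-D tensors of shape (N,) and that the weights have already been built as an (N, N) tensor; the helper name pairwise_hinge_loss is made up for this example:

```python
import torch
import torch.nn.functional as F

def pairwise_hinge_loss(input, target, weight):
    # Vectorized version of:
    #   a_ij = (input[i] - input[j] - target[i] + target[j]) * weight[i, j]
    #   loss = sum_ij relu(a_ij)
    diff_in = input.unsqueeze(1) - input.unsqueeze(0)    # diff_in[i, j] = input[i] - input[j]
    diff_tg = target.unsqueeze(1) - target.unsqueeze(0)  # diff_tg[i, j] = target[i] - target[j]
    return F.relu((diff_in - diff_tg) * weight).sum()

x = torch.tensor([1.0, 2.0, 3.0], requires_grad=True)
y = torch.tensor([1.0, 1.0, 1.0])
w = torch.ones(3, 3)
loss = pairwise_hinge_loss(x, y, w)  # relu of all pairwise diffs: 1 + 2 + 1 = 4
loss.backward()                      # autograd supplies the gradient; no backward() needed
```

Because every op here is a differentiable torch op on tensors, autograd handles the backward pass automatically, and the per-element weight matrix replaces the inner loop entirely.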

I am sorry for the inconvenience, and thanks for the reminder. This is my first time writing code like this, so I did not know how to format it.

Hi, @11127

You do not need to define a class for your loss. A function is sufficient and the backward() comes for free (i.e. you do not need to write any code for it).
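For example, a minimal function-based loss might look like the sketch below (using a made-up hinge-style formula for illustration, not the poster's exact loss):

```python
import torch
import torch.nn.functional as F

def my_loss(input, target):
    # A plain function works as a loss: every op here is a
    # differentiable torch op, so autograd derives backward() for free.
    return F.relu(input - target).sum()

x = torch.tensor([2.0, -1.0], requires_grad=True)
y = torch.zeros(2)
loss = my_loss(x, y)  # relu([2, -1]).sum() = 2
loss.backward()       # no hand-written backward() needed
```

As long as the function is composed of torch operations on tensors that require gradients, calling loss.backward() populates x.grad automatically.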