Define a custom loss function with a weight matrix

I’m trying to write a custom loss function that includes a weight matrix, and I’d like to use it in the following way.

I’m working on an 8-class classification task. The output of my model is a batch of 8-dimensional tensors, where each value is a class probability, and my labels are a batch of one-hot vectors. I’d like to define my loss as output.dot(weight_matrix.dot(labels.T)). I wrote it in the usual way, but I get an error saying the object has no attribute ‘backward’, and I have no idea how to deal with ‘backward’.

Can anyone give me some advice?

Thanks in advance!

Here’s my code↓

class emtional_damage(nn.Module):
    def __init__(self, emo_mat, predict, label):
        super(emtional_damage).__init__()
        self.emo_mat = emo_mat
        self.predict = predict
        self.label = label

    def forward(self):
        loss0 = torch.mm(self.predict, self.emo_mat)
        loss = torch.mm(loss0, self.label.t())
        loss = loss.sum()
        return loss

I don’t think you need to define a class for that.
How about defining a function instead of a class?

def custom_loss(emo_mat, predict, label):
    # predict: (batch, 8) probabilities, emo_mat: (8, 8), label: (batch, 8) one-hot
    loss0 = torch.mm(predict, emo_mat)   # (batch, 8)
    loss = torch.mm(loss0, label.t())    # (batch, batch)
    loss = loss.sum()
    return loss

You can use it as below:

model = YourModel()
out = model(input)
loss = custom_loss(emo_mat, out, label)
...
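To address the original ‘backward’ error: custom_loss returns an ordinary tensor built from differentiable operations, so you can call .backward() on it directly. Here’s a minimal sketch with made-up shapes (a batch of 4 and an 8×8 emo_mat are assumptions, just for illustration):

import torch

batch_size = 4
predict = torch.rand(batch_size, 8, requires_grad=True)   # stand-in for the model output
label = torch.eye(8)[torch.randint(0, 8, (batch_size,))]  # one-hot labels, shape (batch, 8)
emo_mat = torch.rand(8, 8)                                 # weight matrix over the 8 classes

loss = custom_loss(emo_mat, predict, label)  # a scalar tensor, not a Module
loss.backward()                              # gradients flow back into predict
print(predict.grad.shape)                    # torch.Size([4, 8])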

You don’t need a class for this. However, if you really want to use a class, then:

class emtional_damage(nn.Module):
    def __init__(self):
        super().__init__()  # not super(emtional_damage).__init__(), which would skip nn.Module’s __init__

    def forward(self, emo_mat, predict, label):
        loss0 = torch.mm(predict, emo_mat)
        loss = torch.mm(loss0, label.t())
        loss = loss.sum()
        return loss

You can use it as follows:

L = emtional_damage()
.....
loss = L(emo_mat, pred, labels)
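One optional refinement (just a sketch, with a hypothetical name EmoMatLoss): since emo_mat is fixed, you can store it on the module with register_buffer so it follows .to(device) / .cuda(), and pass only predict and label at call time:

import torch
import torch.nn as nn

class EmoMatLoss(nn.Module):
    def __init__(self, emo_mat):
        super().__init__()
        # buffers are saved in the state_dict and moved by .to(device), but are not trained
        self.register_buffer("emo_mat", emo_mat)

    def forward(self, predict, label):
        loss0 = torch.mm(predict, self.emo_mat)  # (batch, 8)
        loss = torch.mm(loss0, label.t())        # (batch, batch)
        return loss.sum()

criterion = EmoMatLoss(emo_mat)
loss = criterion(pred, labels)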