Custom Loss: Loss Function with weight parameters


I wanted to learn whether it makes sense, or is even possible, to create a custom loss that has weight and bias nn.Parameters of its own. How will autograd handle this? Will those weights be learned in conjunction with the model?

This is a dummy example, not the actual loss, but I try to explain below what I want to do.

class MyCustomLoss(nn.Module):
    def __init__(self, in_features, out_features):
        super().__init__()  # required so the layer's parameters are registered
        self.custom_layer = CustomLayer(in_features, out_features)

    def forward(self, inputs, targets):
        return F.cross_entropy(self.custom_layer(inputs, targets), targets)

Again, this is just a dummy example. My layer has weights but also needs the target values in the forward pass, which is why I want to place it in the loss function rather than in the model. I have successfully implemented it by changing my model to also accept the targets, but that seems ugly and not modular, since I would have to manually change the forward pass of every model I want to use with this loss function.

I would prefer a model that just outputs the features, and then handle the custom layer and the loss inside the custom loss module. In the end, I just want a MyCustomLoss that has a layer with learnable weights/parameters inside it and then calls cross_entropy on the output of that layer.
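A minimal sketch of that setup, with a hypothetical `TargetAwareLayer` standing in for the real layer (its internals are assumed, only the signature matters here). The one practical caveat: the loss module's parameters are separate from the model's, so they must be passed to the optimizer explicitly or they will never be updated.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TargetAwareLayer(nn.Module):
    """Hypothetical stand-in: a layer with learnable weights that
    also receives the targets in its forward pass."""
    def __init__(self, in_features, out_features):
        super().__init__()
        self.linear = nn.Linear(in_features, out_features)

    def forward(self, inputs, targets):
        # targets could modulate the computation; unused here beyond
        # illustrating the signature.
        return self.linear(inputs)

class MyCustomLoss(nn.Module):
    def __init__(self, in_features, out_features):
        super().__init__()  # registers the submodule's parameters
        self.custom_layer = TargetAwareLayer(in_features, out_features)

    def forward(self, inputs, targets):
        return F.cross_entropy(self.custom_layer(inputs, targets), targets)

model = nn.Linear(10, 8)
criterion = MyCustomLoss(8, 5)
# Hand BOTH parameter sets to the optimizer, otherwise the loss-side
# weights receive gradients but are never stepped:
optimizer = torch.optim.SGD(
    list(model.parameters()) + list(criterion.parameters()), lr=0.1
)
```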

Your code should be alright.
Basically, the computation graph will be built in the same manner regardless of whether you put your custom layer inside the model or the loss function. As long as the computations stay the same, you'll get the same result.

Do you encounter any issues with your approach?

I will try it and confirm here whether it works. First I wanted to make sure it was theoretically possible to put weights in the loss function and have the training loop still track and update those weights during the backward step.
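For anyone landing here later, a quick sanity check (with an assumed toy setup, not the actual model) showing that the loss module's weights do receive gradients and do change after an optimizer step, provided they were passed to the optimizer:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class LossWithWeights(nn.Module):
    """Loss module with a learnable projection before cross_entropy."""
    def __init__(self, in_features, num_classes):
        super().__init__()
        self.proj = nn.Linear(in_features, num_classes)

    def forward(self, features, targets):
        return F.cross_entropy(self.proj(features), targets)

torch.manual_seed(0)
model = nn.Linear(4, 8)
criterion = LossWithWeights(8, 3)
optimizer = torch.optim.SGD(
    list(model.parameters()) + list(criterion.parameters()), lr=0.1
)

x = torch.randn(16, 4)
y = torch.randint(0, 3, (16,))

before = criterion.proj.weight.clone()
optimizer.zero_grad()
loss = criterion(model(x), y)
loss.backward()
optimizer.step()

# Autograd reached the loss-side parameters: the gradient is populated
# and the optimizer step changed the weight.
assert criterion.proj.weight.grad is not None
assert not torch.equal(before, criterion.proj.weight)
```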