Compute Loss from External Precomputed Loss Value

I am trying to backpropagate an error value that is computed externally by a heuristic. The custom loss function is shown below.

The problem is, when lb (lambda) equals zero, all the gradients are zero after the loss.backward() call.

When lambda is greater than zero (e.g. 0.001), the weight gradients are nonzero.

Any idea?

Best regards.

import torch

class FCLayerCustomLoss(torch.nn.Module):
    def __init__(self, model, lb=0.001):
        super(FCLayerCustomLoss, self).__init__()
        self.model = model
        self.lb = lb

    def forward(self, score):
        # Flatten and concatenate the dense0 parameters into one vector
        dense0_params = torch.cat([x.view(-1) for x in self.model.dense0.parameters()])
        # L2 penalty (torch.norm with p=2) on the dense0 weights, scaled by lambda
        l2_regularization = self.lb * torch.norm(dense0_params, p=2)
        return score + l2_regularization
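
For reference, a minimal standalone sketch of the behavior I see (names are placeholders, not my actual model): `score` stands in for the externally computed heuristic value, so it carries no autograd history, and the only gradient path to the weights is the regularization term, which vanishes when lb == 0.

```python
import torch

torch.manual_seed(0)
model = torch.nn.Linear(4, 1)   # stand-in for model.dense0
score = torch.tensor(3.0)       # external heuristic value, detached from the graph

for lb in (0.0, 0.001):
    model.zero_grad()
    flat = torch.cat([p.view(-1) for p in model.parameters()])
    loss = score + lb * torch.norm(flat, p=2)
    loss.backward()
    total = sum(p.grad.abs().sum().item() for p in model.parameters())
    print(f"lb={lb}: sum of |grad| = {total}")   # 0.0 when lb == 0
```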