Making learnable variables

I am creating a custom loss function that combines two metrics:

def loss_fn(preds, target, a, b):
    loss1 = loss_func1(preds, target)
    loss2 = loss_func2(preds, target)
    loss = a * loss1 + b * loss2
    return loss

How do I make the variables a and b in the above function learnable in the training loop?

You need to make a and b into two nn.Parameters on the device, e.g. nn.Parameter(torch.tensor(1.0, device=my_device)), and pass those to the optimizer (e.g. list(model.parameters()) + [a, b] instead of just the model's parameters).
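
A minimal sketch of that setup, assuming model, my_device, and your loss_fn are already defined (Adam and the learning rate are just placeholders):

import torch
import torch.nn as nn

# Learnable loss weights, created directly on the target device
a = nn.Parameter(torch.tensor(1.0, device=my_device))
b = nn.Parameter(torch.tensor(1.0, device=my_device))

# Hand the extra parameters to the optimizer alongside the model's parameters
optimizer = torch.optim.Adam(list(model.parameters()) + [a, b], lr=1e-3)

# In the training loop they are used like any other tensor:
#     loss = loss_fn(preds, target, a, b)
#     loss.backward()
#     optimizer.step()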

Note that generally it is mathematically difficult to train loss weights, but whether this applies in your case is hard to tell at this level of detail.

Best regards

Thomas

So I just initialize the variables in the constructor of my network class and use them as netobject.a? Or do I have to write something in the backward function as well?

Yes, you could assign them to your own loss module or something similar.
And no, backward will be handled automatically; you don't need to write anything extra.
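
As a rough sketch (assuming loss_func1 and loss_func2 are your two metrics, and model / my_device exist elsewhere), wrapping the weights in a small nn.Module registers them as parameters so they can be passed to the optimizer:

import torch
import torch.nn as nn

class CombinedLoss(nn.Module):
    def __init__(self):
        super().__init__()
        # Registering the weights as nn.Parameters makes them appear in .parameters()
        self.a = nn.Parameter(torch.tensor(1.0))
        self.b = nn.Parameter(torch.tensor(1.0))

    def forward(self, preds, target):
        loss1 = loss_func1(preds, target)
        loss2 = loss_func2(preds, target)
        return self.a * loss1 + self.b * loss2

# criterion = CombinedLoss().to(my_device)
# optimizer = torch.optim.Adam(
#     list(model.parameters()) + list(criterion.parameters()), lr=1e-3
# )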