Thanks for the reply.
I tried using your code with a multi-task loss for 2 tasks, but my model only updates the sigma parameters while the loss keeps the same value. I don't know what is wrong with my code.
I didn't write a separate loss class; I wrote the loss function as in the code below.
class ModelTTestMultitask(nn.Module):
    def __init__(self, cf):
        super().__init__()
        self.sigma = nn.Parameter(torch.ones(2))

    def forward(self, batch):
        do_something()

    def loss(self, batch):
        loss_1 = self.output_layer_task_1.loss(batch.label_task_1)
        loss_2 = self.output_layer_task_2.loss(batch.label_task_2)
        loss_combine = 0.5 * torch.Tensor([loss_1, loss_2]) / self.sigma ** 2
        loss_combine = loss_combine.sum() + torch.log(self.sigma.prod())
        return loss_combine
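My guess is that torch.Tensor([loss_1, loss_2]) copies the two loss values into a brand-new tensor that is detached from the autograd graph, so gradients only reach self.sigma and never flow back into the task heads, while torch.stack([loss_1, loss_2]) would keep them connected. A minimal check with stand-in losses (w here is a hypothetical parameter, not from my real model):

```python
import torch

# Stand-ins for the model parameters and the two task losses
sigma = torch.ones(2, requires_grad=True)
w = torch.tensor([1.0, 2.0], requires_grad=True)
loss_1 = (w[0] - 3.0) ** 2
loss_2 = (w[1] + 1.0) ** 2

# torch.stack keeps both task losses in the autograd graph,
# so gradients reach the model parameters as well as sigma
losses = torch.stack([loss_1, loss_2])
total = (0.5 * losses / sigma ** 2).sum() + torch.log(sigma.prod())
total.backward()

print(w.grad)      # non-None: gradients flow to the model parameters
print(sigma.grad)  # non-None: and to the uncertainty parameters
```

If this is right, replacing torch.Tensor([...]) with torch.stack([...]) in my loss above should let both the model and sigma train together.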
If I don't use the uncertainty-weighted loss and instead use the loss function below, everything trains normally, but the losses of the two tasks are on different scales, and the final model is not as good as I hoped.
def loss(self, batch):
    loss_1 = self.output_layer_task_1.loss(batch.label_task_1)
    loss_2 = self.output_layer_task_2.loss(batch.label_task_2)
    loss_combine = 0.5 * (loss_1 + loss_2)
    return loss_combine
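Until I get the uncertainty weighting working, the only workaround I can think of is fixed manual weights to bring the two losses onto a comparable scale before summing (the weight values below are hypothetical, just to illustrate):

```python
import torch

def combine_losses(loss_1, loss_2, w1=1.0, w2=0.1):
    # Hand-tuned weights (hypothetical values) rebalance two task
    # losses of different magnitudes before summing them
    return w1 * loss_1 + w2 * loss_2

# Example: loss_2 is 10x larger than loss_1; the weights even them out
l1 = torch.tensor(0.5)
l2 = torch.tensor(5.0)
print(combine_losses(l1, l2))
```

But tuning w1/w2 by hand is exactly what the learned sigma weighting is supposed to avoid, which is why I'd like to get the version above working.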