My total loss is computed like this:
loss_iden_12t = IdentLoss(l1[0], l1[1], l1[2])
loss_iden_rcs1s1t = IdentLoss(l2[0], l2[1], l2[2])
loss_iden_rcs2s2t = IdentLoss(l3[0], l3[1], l3[2])
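# sum of the three identity terms, each converted to a Python float by .item()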
loss_identy = loss_iden_12t.item() + loss_iden_rcs1s1t.item() + loss_iden_rcs2s2t.item()
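# discriminator predictions on the reconstructed output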
p_pred_recon = p_d(recon)
g_pred_recon = g_d(recon)
loss_G_g = CELoss(g_pred_recon, torch.ones_like(g_pred_recon))
loss_G_p = CELoss(p_pred_recon, torch.ones_like(p_pred_recon))
loss_G = 20 * loss_G_g + 30 * loss_G_p + 100 * loss_identy
IdentLoss is my custom loss function.
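For context, IdentLoss takes three tensors and returns a scalar tensor (otherwise the .item() calls above would fail); a simplified stand-in would look something like this (not my real computation):

def IdentLoss(a, b, c):
    # hypothetical stand-in for my custom loss, just to show the shape:
    # three tensor inputs in, one scalar tensor out
    return (a - b).abs().mean() + (a - c).abs().mean()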
I know that the object I call loss.backward() on has to be a Tensor, but in a composite loss, can one of the terms be a plain Python float? I tried it with loss_identy as a float and got no error, but I am worried about backpropagation through IdentLoss: even though I used a float loss value, did it actually backpropagate?
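To make the question concrete, here is a minimal sketch of what I am testing (F.mse_loss is just a stand-in for my IdentLoss):

import torch
import torch.nn.functional as F

pred = torch.randn(4, 3, requires_grad=True)
target = torch.randn(4, 3)

tensor_loss = F.mse_loss(pred, target)         # scalar tensor, stays in the autograd graph
float_loss = F.mse_loss(pred, target).item()   # .item() returns a detached Python float

total = 20 * tensor_loss + 100 * float_loss    # the float term behaves like a constant
total.backward()
print(pred.grad)                               # gradients here come only from tensor_loss

Is that the right reading, i.e. the .item() terms are treated as constants and contribute nothing to the gradients?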