Add losses together

Hi

Suppose I have many losses that need to be added together and I don't want to do it in the normal way, i.e. L = l1 + l2 + ...

Can I do something like:

import torch

loss = torch.autograd.Variable(torch.FloatTensor([0])).cuda()  # zero accumulator, moved to the GPU
for i in range(num):
    loss += l[i]

Is this the correct way to do it?

Many Thanks

Yes, if you make sure that loss is on the same device as the losses.
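
For example, a minimal sketch (the l list here is just dummy data standing in for your actual losses):

import torch

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
l = [torch.rand(1, device=device, requires_grad=True) for _ in range(5)]  # dummy losses for illustration
num = len(l)

loss = torch.zeros(1, device=device)  # accumulator created on the same device as the losses
for i in range(num):
    loss = loss + l[i]

loss.backward()  # gradients flow back to every term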

Why not use the normal way?

Hi Simon

Thanks for the reply. Because there are around a hundred losses, and I need to select some of them based on a certain criterion. With the normal way, I would have to change the code every time the criterion changes.
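
Roughly, what I have in mind looks like this, where keep_loss is just a made-up placeholder for whatever criterion I end up using:

def keep_loss(i, value):
    # placeholder criterion, purely for illustration
    return value.item() > 0.5

loss = torch.zeros(1, device=l[0].device)  # same device as the individual losses
for i in range(num):
    if keep_loss(i, l[i]):
        loss = loss + l[i]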

Okay, that makes sense. In fact, you can even start with a plain Python number, i.e. loss = 0.
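
Something like this sketch, say (reusing the placeholder keep_loss criterion from above):

loss = 0  # plain Python number as the starting value
for i in range(num):
    if keep_loss(i, l[i]):
        loss = loss + l[i]  # the first addition turns loss into a tensor on the losses' device

Just keep in mind that if nothing gets selected, loss stays a plain 0 and you can't call backward() on it.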