Add function error

I've been trying to implement my own loss function to later perform backprop with. I need to combine two losses into one of the form loss_all = loss_first + B * loss_second, where loss_first and loss_second are 1-dim tensors and B is a scalar. However, the add function is raising the following error:

"add() takes 2 positional arguments but 3 were given"

In the docs, there is an example using 3 positional arguments. Does anyone know what might be happening?


    loss_all = torch.add(loss_first, B, loss_second)



I am not sure the three-positional-argument form of add is supported for Variables.
What you can do instead is:

loss_all = loss_first + B * loss_second
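For completeness, newer PyTorch versions do support a scaled add, but the scale is passed as the alpha keyword argument rather than a third positional argument. A minimal sketch (the tensor values below are made-up stand-ins for loss_first, loss_second, and B):

```python
import torch

# Hypothetical values standing in for the question's loss_first, loss_second, and B.
loss_first = torch.tensor([0.5])
loss_second = torch.tensor([2.0])
B = 3.0

# Plain arithmetic works and keeps the autograd graph intact:
loss_all = loss_first + B * loss_second

# Equivalent spelling with torch.add's `alpha` keyword (computes input + alpha * other):
loss_all_alt = torch.add(loss_first, loss_second, alpha=B)

assert torch.equal(loss_all, loss_all_alt)
```

Both forms produce the same 1-dim tensor and can be backpropagated through as usual.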