Is Clone/Detach correct for combined loss?

Hello all, I have two variables a, b with requires_grad=True. loss1 takes a and b as inputs, while loss2 takes the normalized versions of a and b as inputs. Should I use a clone in this case? This is my implementation:

loss1 = loss1(a, b)
a_norm = a / 255
b_norm = b / 255
loss2 = loss2(a_norm, b_norm)

loss_total = loss1 + loss2

Hi,

You don’t need to clone or detach here. The code you have will work fine.
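To illustrate, here is a minimal runnable sketch of that pattern. The actual loss1/loss2 functions are not given in the post, so plain MSE losses and a zero target stand in for them; the point is that both losses flow back into a and b from a single backward() call, with no clone() or detach() needed:

import torch
import torch.nn.functional as F

a = torch.rand(4, requires_grad=True)
b = torch.rand(4, requires_grad=True)
target = torch.zeros(4)  # placeholder target, just for the example

# First loss on the raw tensors
loss1 = F.mse_loss(a + b, target)

# Second loss on the normalized tensors; a / 255 builds new tensors,
# a and b themselves are untouched
a_norm = a / 255
b_norm = b / 255
loss2 = F.mse_loss(a_norm + b_norm, target)

loss_total = loss1 + loss2
loss_total.backward()

# Both losses contribute to the gradients of a and b
print(a.grad, b.grad)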


@albanD: Thanks. But do you think loss1 will receive a normalized value after the gradient update, since we use a_norm = a / 255 in the forward pass?

When you do a_norm = a / 255, you do not modify a in any way.
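For example (a small sketch with arbitrary values): the division is an out-of-place op that returns a new tensor with its own storage, so a keeps its original values.

import torch

a = torch.full((3,), 510.0, requires_grad=True)
a_norm = a / 255  # out-of-place op: returns a new tensor

print(a)       # still tensor([510., 510., 510.], requires_grad=True)
print(a_norm)  # tensor([2., 2., 2.], grad_fn=<DivBackward0>)
print(a.data_ptr() == a_norm.data_ptr())  # False: separate storage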