Hello all, I have two variables `a`, `b` with `requires_grad=True`. `loss1` takes `a` and `b` directly as inputs, while `loss2` takes the normalized versions of `a` and `b`. Should I use `clone()` in this case? This is my implementation:


```python
loss1 = loss1(a, b)
a_norm = a / 255
b_norm = b / 255
loss2 = loss2(a_norm, b_norm)
loss_total = loss1 + loss2
```
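For context, here is a minimal runnable sketch of this pattern. The actual `loss1`/`loss2` functions are not given in the thread, so simple sum-based stand-ins are used here just to show that gradients from both losses accumulate into `a.grad` and `b.grad` with a single `backward()` call, no `clone()` or `detach()` needed:

```python
import torch

# Stand-in losses (hypothetical): any differentiable functions would do.
a = torch.randn(4, requires_grad=True)
b = torch.randn(4, requires_grad=True)

loss1 = (a * b).sum()              # takes a and b directly
a_norm = a / 255                   # out-of-place op: a itself is untouched
b_norm = b / 255
loss2 = (a_norm - b_norm).pow(2).sum()

loss_total = loss1 + loss2
loss_total.backward()              # gradients from BOTH losses accumulate

print(a.grad)                      # contains d(loss1)/da + d(loss2)/da
print(b.grad)
```

Autograd records the division as part of the graph, so the `loss2` path contributes its gradient to `a` and `b` through `a_norm` and `b_norm` automatically.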

Hi,

You don’t need to clone or detach here. The code you have will work well.

@albanD: Thanks. But do you think `loss1` will receive the normalized value after the gradient update, because we use `a_norm = a / 255` in the forward pass?

When you do `a_norm = a / 255`, you do not modify `a` in any way: division is an out-of-place op, so `a_norm` is a brand-new Tensor holding the result, while `a` keeps its original values. `loss1` always sees the un-normalized `a` and `b`.
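A quick check, assuming nothing beyond plain PyTorch, confirms that the division produces a separate tensor with its own storage and leaves `a` unchanged:

```python
import torch

a = torch.ones(3, requires_grad=True)
a_norm = a / 255                          # out-of-place: returns a new tensor

print(a_norm is a)                        # False: different Python objects
print(a.data_ptr() == a_norm.data_ptr())  # False: different storage
print(a)                                  # still all ones; a was not modified
```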