Assigning a value to the loss function based on a condition


I have a very basic question about the PyTorch graph. If I have a condition like this:

     if STATEMENT:
         loss = ...  # computed by a PyTorch API such as BCELoss
     else:
         loss = 0

Does such a condition detach the loss from the graph and break the gradients and parameter updates?

If the STATEMENT is false, no loss is computed (and therefore no gradients).
So if I read that right, it should work!

That’s basically right. If you use loss after the condition to call loss.backward() on it, though, it will raise an error.
I would rather zero out the loss:

     loss = calculate the loss
     loss = loss * 0.

Alternatively, your training code would have to be inside the if STATEMENT branch.
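To illustrate, here is a minimal runnable sketch of the zeroing-out approach. The model, loss, and condition flag are toy stand-ins (not from the original post): multiplying the loss by 0. keeps it a tensor attached to the graph, so loss.backward() still runs, and the resulting gradients are all zero, making the parameter update a no-op.

```python
import torch
import torch.nn as nn

# Toy setup (hypothetical): a single linear layer trained with BCE loss.
model = nn.Linear(4, 1)
criterion = nn.BCELoss()

x = torch.randn(2, 4)
target = torch.rand(2, 1)

output = torch.sigmoid(model(x))
loss = criterion(output, target)

condition = False  # stand-in for the STATEMENT above
if not condition:
    # Instead of `loss = 0`, zero the loss while keeping it in the graph.
    loss = loss * 0.

loss.backward()  # works: loss is still a tensor with a grad_fn
# All gradients are zero, so an optimizer step would change nothing.
print(model.weight.grad.abs().sum().item())  # 0.0
```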

Yes, I am using the loss after the if statement in the form of loss.backward(). Please correct me if I am wrong;
you are suggesting:

     loss = calculate the loss
     loss = loss * 0

Am I correct?

Yes. Otherwise you are assigning an int to loss, which will raise an error if you try to call loss.backward().
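A minimal reproduction of that error, for reference: a plain Python int has no .backward() method, so the call fails with an AttributeError.

```python
loss = 0  # plain Python int, not a tensor in the autograd graph

try:
    loss.backward()
except AttributeError as e:
    # 'int' object has no attribute 'backward'
    print(e)
```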
