hosseinshn
(Hossein)
September 25, 2018, 12:45am
#1
Hi,
I have a very basic question about the PyTorch graph. If I have a condition like this:

if STATEMENT:
    # compute the loss with a PyTorch criterion such as nn.BCELoss
    loss = criterion(output, target)
else:
    loss = 0

does such a condition detach the loss from the graph and break the gradients and parameter updates?
If the STATEMENT is false, no loss is computed (and therefore no gradients either). So if I read it correctly, that should work!
ptrblck
September 26, 2018, 2:51pm
#3
That’s basically right. If you use loss after the condition to call loss.backward() on it, it will raise an error, though. I would rather zero out the loss:

loss = criterion(output, target)
loss = loss * 0.

Alternatively, your training code would have to live inside the if STATEMENT branch.
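A minimal runnable sketch of this zeroing approach, assuming a toy model, random data, and a placeholder STATEMENT flag (none of these names come from the thread):

import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(10, 1), nn.Sigmoid())
criterion = nn.BCELoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

x = torch.randn(4, 10)             # dummy inputs
target = torch.rand(4, 1).round()  # dummy binary targets

STATEMENT = False  # placeholder condition

output = model(x)
loss = criterion(output, target)
if not STATEMENT:
    loss = loss * 0.  # loss stays a tensor attached to the graph; gradients become zero

optimizer.zero_grad()
loss.backward()   # works even when the loss was zeroed out
optimizer.step()  # parameters are unchanged, since all gradients are zero

Since loss is always a tensor here, loss.backward() is safe in both branches; when the condition is false it simply produces zero gradients.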
hosseinshn
(Hossein)
September 26, 2018, 5:11pm
#4
Yes, I am using the loss after the if statement by calling loss.backward(). Please correct me if I am wrong, but you are suggesting:

loss = criterion(output, target)
if not STATEMENT:
    loss = loss * 0.

Am I correct?
ptrblck
September 26, 2018, 5:26pm
#5
Yes, otherwise you would be assigning a plain Python int to loss, which will throw an error if you try to call loss.backward().
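For illustration, a quick repro of that failure mode (my own example, not from the thread):

loss = 0         # plain Python int, not part of any autograd graph
loss.backward()  # AttributeError: 'int' object has no attribute 'backward'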