How to call loss.backward() in an if statement?

I am training a net with hinge loss (or something similar). When the loss is zero, the gradient is automatically zero as well. However, I also want to use L2 regularization, so to save computing time I use the following method.

optimizer = optim.SGD(parameters, lr=0.001, weight_decay=0.01)

for iter in range(10):
    optimizer.zero_grad()
    a = model(data)
    loss = somecriterion(a)
    # Skip backprop when the loss is exactly zero: the gradient would be zero anyway.
    if loss.item() != 0:
        loss.backward()
    optimizer.step()

This way, when the loss is zero, I do not need to run backprop, since the result is zero for sure. And because the optimizer has weight decay, optimizer.step() still updates the model. But this causes a memory leak. Any suggestions?

Hi,

I don’t think this causes a memory leak, but it can increase the peak memory usage.
Indeed, the whole autograd history corresponding to your loss is kept alive by the loss Python variable. That history is freed when .backward() is called. But when .backward() is not called, it is only freed when the loss variable is overwritten, and unfortunately that only happens during the next forward pass. So the peak memory usage will be much larger.
You can add del loss when you do not call .backward() to reduce the peak memory usage.
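
As a minimal sketch of that change, reusing the loop, model, data, and somecriterion names from your question (those names are just placeholders from the original snippet):

    for iter in range(10):
        optimizer.zero_grad()
        a = model(data)
        loss = somecriterion(a)
        if loss.item() != 0:
            loss.backward()
        else:
            # Free the autograd graph held by `loss` right away,
            # instead of waiting for the next forward pass to overwrite it.
            del loss
        # Weight decay from the optimizer still updates the parameters,
        # even when no gradients were accumulated this iteration.
        optimizer.step()
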