Loss.backward() failure after pruning

I prune a model by setting certain channels to zero, and afterwards loss.backward() no longer works. I suspect it is because the pruned tensor can no longer be differentiated through. How can I run backpropagation regardless of the pruned channels?

If you do this on an intermediate tensor, you should multiply by a mask rather than assigning to the values in place. Quite likely the operation that produced the non-pruned output needs that output to compute the backward pass, and overwriting it in place breaks that. (Search for “inplace” in the forums.)
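
A minimal sketch of the difference (tensor shapes and the channel index are just for illustration):

```python
import torch

x = torch.randn(4, 8, requires_grad=True)
w = torch.randn(8, 8, requires_grad=True)

# Intermediate activation; sigmoid's backward uses its own output,
# so modifying `h` in place would break autograd.
h = torch.sigmoid(x @ w)

# Problematic: in-place assignment on the intermediate.
# h[:, 2] = 0  # raises "a variable needed for gradient computation has been modified by an inplace operation"

# Instead, zero the pruned channel by multiplying with a mask (out of place).
mask = torch.ones(8)
mask[2] = 0          # prune channel 2 (illustrative index)
h = h * mask         # gradients flow through kept channels; pruned ones get zero gradient

loss = h.sum()
loss.backward()      # works; x.grad and w.grad are populated
print(x.grad.shape, w.grad.shape)
```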

Best regards

Thomas

Thanks for the help. I looked into the in-place operation and that solved my problem.